
Real Time Hardware Accelerated DirectX Ray Tracing - How Xbox Series X will show a generational leap in Video Game Graphics

Bernkastel

Ask me about my fanboy energy!

On March 19, 2018, Microsoft announced DirectX Raytracing, showcased with demos from Futuremark, Epic, and EA's SEED team.
3D Graphics is a Lie
The Project Scarlett reveal trailer already confirmed that Xbox Series X supports real-time hardware-accelerated ray tracing, and the Digital Foundry hands-on finally gave us the details:
Hardware Accelerated DirectX Raytracing (DXR) – From improved lighting, shadows and reflections as well as more realistic acoustics and spatial audio, raytracing enables developers to create more physically accurate worlds. For the very first time in a game console, Xbox Series X includes support for high performance, hardware accelerated raytracing. Xbox Series X uses a custom-designed GPU leveraging the latest innovation from our partners at AMD and built in collaboration with the same team who developed DirectX Raytracing. Developers will be able to deliver incredibly immersive visual and audio experiences using the same techniques on PC and beyond.
AMD followed with a blog post:
  • A customized GPU based on next generation AMD RDNA 2 gaming architecture with 52 compute units to deliver 12 TFLOPS of single precision performance, enabling increases in graphics performance and hardware accelerated DirectX Raytracing and Variable Rate Shading.
...
The Xbox Series X SoC was architected for the next generation of DirectX API extensions with hardware acceleration for Raytracing, and Variable Rate Shading. Raytracing is especially one of the most visible new features for gamers, which simulates the properties of light, in real time, more accurately than any technology before it. Realistic lighting completely changes a game and the gamer’s perception.
...
The Xbox Series X is the biggest generational leap of SoC and API design that we’ve done with Microsoft, and it’s an honor for AMD to partner with Microsoft for this endeavor.
Another big game changer is DirectML.
DirectML – Xbox Series X supports Machine Learning for games with DirectML, a component of DirectX. DirectML leverages unprecedented hardware performance in a console, benefiting from over 24 TFLOPS of 16-bit float performance and over 97 TOPS (trillion operations per second) of 4-bit integer performance on Xbox Series X. Machine Learning can improve a wide range of areas, such as making NPCs much smarter, providing vastly more lifelike animation, and greatly improving visual quality.
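Those figures are consistent with simple precision scaling of the GPU's base FP32 rate. As a back-of-the-envelope check (the 52 CUs and 12 TFLOPS come from the quotes above; the 64 ALUs per CU, 2 ops per clock for a fused multiply-add, and the 1.825 GHz clock are published Series X/RDNA 2 figures, assumed here):

```cpp
#include <cstdio>

int main() {
    // 52 CUs x 64 ALUs x 2 ops per clock (fused multiply-add) x 1.825 GHz
    const double fp32 = 52.0 * 64 * 2 * 1.825e9;
    std::printf("FP32: %.1f TFLOPS\n", fp32 / 1e12);                               // ~12.1
    std::printf("FP16: %.1f TFLOPS (double-rate packed math)\n", 2 * fp32 / 1e12); // ~24.3
    std::printf("INT8: %.1f TOPS (4x FP32 rate, quoted as 49)\n", 4 * fp32 / 1e12); // ~48.6
    std::printf("INT4: %.1f TOPS (8x FP32 rate, quoted as 97)\n", 8 * fp32 / 1e12); // ~97.2
}
```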
DirectML was part of Nvidia's SIGGRAPH 2018 tech talk:
Forza Horizon 3 demo at 16:06
At 19:06


At 19:28
Performance comparison at 22:38
We couldn’t write a graphics blog without calling out how DNNs (Deep Neural Networks) can help improve the visual quality and performance of games. Take a close look at what happens when NVIDIA uses ML to up-sample this photo of a car by 4x. At first the images will look quite similar, but when you zoom in close, you’ll notice that the car on the right has some jagged edges, or aliasing, and the one using ML on the left is crisper. Models can learn to determine the best color for each pixel to benefit small images that are upscaled, or images that are zoomed in on. You may have had the experience when playing a game where objects look great from afar, but when you move close to a wall or hide behind a crate, things start to look a bit blocky or fuzzy – with ML we may see the end of those types of experiences.
[images: the original and ML up-sampled cars compared]

DirectML Technology Overview
We know that performance is a gamer’s top priority. So, we built DirectML to provide GPU hardware acceleration for games that use Windows Machine Learning. DirectML was built with the same principles of DirectX technology: speed, standardized access to the latest in hardware features, and most importantly, hassle-free for gamers and game developers – no additional downloads, no compatibility issues – everything just works. To understand how DirectML fits within our portfolio of graphics technology, it helps to understand what the Machine Learning stack looks like and how it overlaps with graphics.
[image: WinML architecture stack]

DirectML is built on top of Direct3D because D3D (and graphics processors) are very good for matrix math, which is used as the basis of all DNN models and evaluations. In the same way that High Level Shader Language (HLSL) is used to execute graphics rendering algorithms, HLSL can also be used to describe parallel algorithms of matrix math that represent the operators used during inference on a DNN. When executed, this HLSL code receives all the benefits of running in parallel on the GPU, making inference run extremely efficiently, just like a graphics application.
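To make the "HLSL is just parallel matrix math" point concrete, here is a minimal CPU sketch in C++ of the kind of operator being described: a fully connected layer is a matrix-vector multiply plus a bias, and a compute shader evaluates the same inner loop with one GPU thread per output element. This is an illustration only, not DirectML's actual code:

```cpp
#include <cstddef>
#include <vector>

// Dense (fully connected) layer: out = weights * input + bias.
// On the GPU, the outer loop disappears: each output element is
// computed by its own shader thread running the same inner loop.
std::vector<float> dense_layer(const std::vector<float>& weights, // rows x cols, row-major
                               const std::vector<float>& input,   // cols
                               const std::vector<float>& bias,    // rows
                               std::size_t rows, std::size_t cols) {
    std::vector<float> out(rows);
    for (std::size_t r = 0; r < rows; ++r) {            // one GPU thread per r
        float acc = bias[r];
        for (std::size_t c = 0; c < cols; ++c)
            acc += weights[r * cols + c] * input[c];    // multiply-accumulate
        out[r] = acc;                                   // activation (e.g. ReLU) follows
    }
    return out;
}
```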

In DirectX, games use graphics and compute queues to schedule each frame rendered. Because ML work is considered compute work, it is run on the compute queue alongside all the scheduled game work on the graphics queue. When a model performs inference, the work is done in D3D12 on compute queues. DirectML efficiently records command lists that can be processed asynchronously with your game. Command lists contain machine learning code with instructions to process neurons and are submitted to the GPU through the command queue. This helps integrate machine learning workloads with graphics work, which makes bringing ML models to games more efficient and gives game developers more control over synchronization on the hardware.
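At the D3D12 level that flow looks roughly like the sketch below: ML dispatches are recorded into a command list and submitted on a COMPUTE-type queue, so they can execute while the graphics queue is busy rendering. Error handling is omitted, the device is assumed to already exist, and the DirectML recording step itself is only indicated by a comment:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Submit ML work on a dedicated compute queue so it can overlap
// rendering happening on the graphics (DIRECT) queue.
void submit_ml_work(ID3D12Device* device) {
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;           // compute, not DIRECT
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));

    ComPtr<ID3D12CommandAllocator> allocator;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_COMPUTE,
                                   IID_PPV_ARGS(&allocator));
    ComPtr<ID3D12GraphicsCommandList> list;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_COMPUTE,
                              allocator.Get(), nullptr, IID_PPV_ARGS(&list));

    // ... record the model's dispatches here (DirectML provides a
    // command recorder for this) ...

    list->Close();
    ID3D12CommandList* lists[] = { list.Get() };
    computeQueue->ExecuteCommandLists(1, lists);           // runs async to graphics
    // An ID3D12Fence would then synchronize the results back into the frame.
}
```

The key design point is the queue type: COMPUTE command lists can be scheduled alongside graphics work, which is what lets inference hide behind rendering.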
PS: DirectML is the part of Windows ML meant for gaming.




Machine learning is a feature we've discussed in the past, most notably with Nvidia's Turing architecture and the firm's DLSS AI upscaling. The RDNA 2 architecture used in Series X does not have tensor core equivalents, but Microsoft and AMD have come up with a novel, efficient solution based on the standard shader cores. With over 12 teraflops of FP32 compute, RDNA 2 also allows for double that with FP16 (yes, rapid-packed math is back). However, machine learning workloads often use much lower precision than that, so the RDNA 2 shaders were adapted still further.
"We knew that many inference algorithms need only 8-bit and 4-bit integer positions for weights and the math operations involving those weights comprise the bulk of the performance overhead for those algorithms," says Andrew Goossen. "So we added special hardware support for this specific scenario. The result is that Series X offers 49 TOPS for 8-bit integer operations and 97 TOPS for 4-bit integer operations. Note that the weights are integers, so those are TOPS and not TFLOPs. The net result is that Series X offers unparalleled intelligence for machine learning."
...
However, the big innovation is clearly the addition of hardware accelerated ray tracing. This is hugely exciting and at Digital Foundry, we've been tracking the evolution of this new technology via the DXR and Vulkan-powered games we've seen running on Nvidia's RTX cards, and the console implementation of RT is more ambitious than we believed possible.
RDNA 2 fully supports the latest DXR Tier 1.1 standard, and similar to the Turing RT core, it accelerates the creation of the so-called BVH structures required to accurately map ray traversal and intersections, tested against geometry. In short, in the same way that light 'bounces' in the real world, the hardware acceleration for ray tracing maps traversal and intersection of light at a rate of up to 380 billion intersections per second.
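That quoted rate also checks out arithmetically: 52 CUs × 4 box tests per CU per clock × 1.825 GHz ≈ 380 billion intersections per second (the 4-per-clock figure and the clock speed are published RDNA 2/Series X numbers, not from the article itself). The test being accelerated at every BVH node is the classic ray/AABB "slab test"; a plain C++ sketch of what the fixed-function unit computes looks like this:

```cpp
#include <algorithm>
#include <utility>

struct Ray  { float ox, oy, oz; float idx, idy, idz; }; // origin + 1/direction
struct AABB { float minx, miny, minz, maxx, maxy, maxz; };

// Slab test: does the ray's parametric interval overlap the box on all
// three axes? BVH traversal runs this at every node to decide whether
// the subtree can contain a hit; triangle tests happen only at leaves.
bool ray_hits_box(const Ray& r, const AABB& b, float tmax) {
    float t0 = 0.0f, t1 = tmax;
    auto slab = [&](float lo, float hi, float o, float inv) {
        float tn = (lo - o) * inv;
        float tf = (hi - o) * inv;
        if (tn > tf) std::swap(tn, tf);
        t0 = std::max(t0, tn);   // latest entry across axes
        t1 = std::min(t1, tf);   // earliest exit across axes
    };
    slab(b.minx, b.maxx, r.ox, r.idx);
    slab(b.miny, b.maxy, r.oy, r.idy);
    slab(b.minz, b.maxz, r.oz, r.idz);
    return t0 <= t1;             // non-empty overlap => descend into the node
}
```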
"Without hardware acceleration, this work could have been done in the shaders, but would have consumed over 13 TFLOPs alone," says Andrew Goossen. "For the Series X, this work is offloaded onto dedicated hardware and the shader can continue to run in parallel with full performance. In other words, Series X can effectively tap the equivalent of well over 25 TFLOPs of performance while ray tracing."
It is important to put this into context, however. While workloads can operate at the same time, calculating the BVH structure is only one component of the ray tracing procedure. The standard shaders in the GPU also need to pull their weight, so elements like the lighting calculations are still run on the standard shaders, with the DXR API adding new stages to the GPU pipeline to carry out this task efficiently. So yes, RT is typically associated with a drop in performance and that carries across to the console implementation, but with the benefits of a fixed console design, we should expect to see developers optimise more aggressively and also to innovate. The good news is that Microsoft allows low-level access to the RT acceleration hardware.
"[Series X] goes even further than the PC standard in offering more power and flexibility to developers," reveals Goossen. "In grand console tradition, we also support direct to the metal programming including support for offline BVH construction and optimisation. With these building blocks, we expect ray tracing to be an area of incredible visuals and great innovation by developers over the course of the console's lifetime."
The proof of the pudding is in the tasting, of course. During our time at the Redmond campus, Microsoft demonstrated how fully featured the console's RT features are by rolling out a very early Xbox Series X Minecraft DXR tech demo, which is based on the Minecraft RTX code we saw back at Gamescom last year and looks very similar, despite running on a very different GPU. This suggests an irony of sorts: base Nvidia code adapted and running on AMD-sourced ray tracing hardware within Series X. What's impressive about this is that it's fully path-traced. Aside from the skybox and the moon in the demo we saw, there are no rasterised elements whatsoever. The entire presentation is ray traced, demonstrating that despite the constraints of having to deliver RT in a console with a limited power and silicon budget, Xbox Series X is capable of delivering the most ambitious, most striking implementation of ray tracing - and it does so in real time.
Minecraft DXR is an ambitious statement - total ray tracing, if you like - but we should expect to see the technology used in very different ways. "We're super excited for DXR and the hardware ray tracing support," says Mike Rayner, technical director of the Coalition and Gears 5. "We have some compute-based ray tracing in Gears 5, we have ray traced shadows and the [new] screen-space global illumination is a form of ray traced screen-based GI and so, we're interested in how the ray tracing hardware can be used to take techniques like this and then move them out to utilising the DXR cores.
"I think, for us, the way that we've been thinking about it is as we look forward, we think hybrid rendering between traditional rendering techniques and then using DXR - whether for shadows or GI or adding reflections - are things that can really augment the scene and [we can] use all of that chip to get the best final visual quality."
...
Microsoft ATG principal software engineer Claude Marais showed us how a machine learning algorithm using Gears 5's state-of-the-art HDR implementation is able to infer a full HDR implementation from SDR content on any back-compat title. It's not fake HDR either, Marais rolled out a heatmap mode showing peak brightness for every on-screen element, clearly demonstrating that highlights were well beyond the SDR range.
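Microsoft hasn't detailed how the model works, but conceptually this is inverse tone mapping: take an SDR value and expand its highlights beyond the roughly 100-nit SDR ceiling. The sketch below substitutes a fixed power curve for the learned mapping Marais described; the curve shape and peak-brightness values are illustrative assumptions only:

```cpp
#include <cmath>

// Toy SDR-to-HDR expansion. Midtones stay roughly where SDR put them;
// values near 1.0 are pushed toward the display's HDR peak, which is
// what a brightness heatmap would show as "beyond SDR range".
float sdr_to_nits(float sdr /* 0..1 */) {
    const float sdrWhite = 100.0f;   // nominal SDR white (assumed)
    const float hdrPeak  = 1000.0f;  // assumed display peak brightness
    float base  = sdr * sdrWhite;                               // linear SDR part
    float boost = std::pow(sdr, 6.0f) * (hdrPeak - sdrWhite);   // highlight expansion
    return base + boost;
}
```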
Journalist: How hard is game development going to get for the next generation? For PlayStation 5 and Xbox Series X? The big problem in the past was when you had to switch to a new chip, like the Cell. It was a disaster. PlayStation 3 development was painful and slow. It took years and drove up costs. But since you’re on x86, it shouldn’t happen, right? A lot of those painful things go away because it’s just another faster PC. But what’s going to be hard? What’s the next bar that everybody is going to shoot for that’s going to give them a lot of pain, because they’re trying to shoot too high?
Gwertzman:
You were talking about machine learning and content generation. I think that’s going to be interesting. One of the studios inside Microsoft has been experimenting with using ML models for asset generation. It’s working scarily well. To the point where we’re looking at shipping really low-res textures and having ML models uprez the textures in real time. You can’t tell the difference between the hand-authored high-res texture and the machine-scaled-up low-res texture, to the point that you may as well ship the low-res texture and let the machine do it.
Journalist: Can you do that on the hardware without install time?
Gwertzman:
Not even install time. Run time.
Journalist: To clarify, you’re talking about real time, moving around the 3D space, level of detail style?
Gwertzman:
Like literally not having to ship massive 2K by 2K textures. You can ship tiny textures.
Journalist: Are you saying they’re generated on the fly as you move around the scene, or they’re generated ahead of time?
Gwertzman:
The textures are being uprezzed in real time.
Journalist: So you can fit it on one Blu-ray.
Gwertzman:
The download is way smaller, but there’s no appreciable difference in game quality. Think of it more like a magical compression technology. That’s really magical. It takes a huge R&D budget. I look at things like that and say — either this is the next hard thing to compete on, hiring data scientists for a game studio, or it’s a product opportunity. We could be providing technologies like this to everyone to level the playing field again.
Journalist: Where does the source data set for that come from? Do you take every texture from every game that ships under Microsoft Game Studios?
Gwertzman:
In this case, it only works by training the models on very specific sets. One genre of game. There’s no universal texture map. That would be kind of magical. It’s more like, if you train it on specific textures and then you — it works with those, but it wouldn’t work with a whole different set.
Journalist: So you still need an artist to create the original set.
Journalist: Are there any legal considerations around what you feed into the model?
Gwertzman:
It’s especially good for photorealism, because that adds tons of data. It may not work so well for a fantasy art style. But my point is that I think the fact that that’s a technology now — game development has always been hard in terms of the sheer number of disciplines you have to master. Art, physics, geography, UI, psychology, operant conditioning. All these things we have to master. Then we add backend services and latency and multiplayer, and that’s hard enough. Then we added microtransactions and economy management and running your own retail store inside your game. Now we’re adding data science and machine learning. The barrier seems to be getting higher and higher.
That’s where I come in. At heart, Microsoft is a productivity company. Our employee badge says on the back, the company mission is to help people achieve more. How do we help developers achieve more? That’s what we’re trying to figure out.
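As a sketch of the pipeline Gwertzman describes: ship the small texture, then run a super-resolution model at load or run time to reconstruct the high-res version. Everything below is hypothetical scaffolding; the placeholder "model" just does nearest-neighbour so the code runs, where a real title would evaluate a trained network (e.g. through DirectML):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

struct Texture { int w = 0, h = 0; std::vector<uint8_t> rgba; };

// Stand-in for a trained super-resolution network. A real model would
// infer plausible detail; this placeholder only repeats pixels so the
// sketch is self-contained and runnable.
struct SuperResModel {
    Texture upscale4x(const Texture& in) const {
        Texture out;
        out.w = in.w * 4;
        out.h = in.h * 4;
        out.rgba.resize(std::size_t(out.w) * out.h * 4);
        for (int y = 0; y < out.h; ++y)
            for (int x = 0; x < out.w; ++x)
                for (int c = 0; c < 4; ++c)
                    out.rgba[(std::size_t(y) * out.w + x) * 4 + c] =
                        in.rgba[(std::size_t(y / 4) * in.w + x / 4) * 4 + c];
        return out;
    }
};

// The game ships e.g. a 512x512 texture and materializes the
// 2048x2048 version in memory, trading disk/download size for compute.
Texture load_texture(const Texture& shipped, const SuperResModel& model) {
    return model.upscale4x(shipped);
}
```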
Xbox Series X also features support for DirectML. Are you planning to use the Machine Learning API in some ways for Scorn?
DirectML and NVIDIA's DLSS 2.0 are very interesting solutions when the game is not hitting the desired performance and it feels like these solutions could help players with weaker systems quite substantially. A lot of these new features have been at our disposal for a very limited amount of time. We will try our best to give players as many options as possible.
 

Bernkastel

Ask me about my fanboy energy!
Games that will support DirectX Ray-Tracing
 
Nice thread and marketing summary but I'm not quite sure what I'm supposed to discuss here?
I still think we'll only see RT used in isolated functions. I hope it does not negatively impact the 60fps goal they probably have for Halo Infinite :(
 
I’m curious to see how many games truly have full blown RT or just partial RT implementation. Also curious to see how different ways or alternatives to RT develop over next gen to save performance

Nice thread and marketing summary but I'm not quite sure what I'm supposed to discuss here?
I still think we'll only see RT used in isolated functions. I hope it does not negatively impact the 60fps goal they probably have for Halo Infinite :(

Considering Infinite can run on the base XBO, I think 4K/60/some RT implementation is realistic for the XSX version.
 

Xplainin

Banned
Microsoft has really put out the best hardware they could have at this point.
Ray Tracing is great, but I'm really interested to see what they can do with ML. This gen, their new hardware will launch with their new API for the first time.
 

jimbojim

Banned
I’m curious to see how many games truly have full blown RT or just partial RT implementation. Also curious to see how different ways or alternatives to RT develop over next gen to save performance



Considering Infinite can run on the base XBO, I think 4K/60/some RT implementation is realistic for the XSX version.

Minecraft with full path-traced RT was 1080p/unstable 60fps running on a much more powerful RTX 2080 Ti and surely a much more powerful CPU. How can you expect 4K/60 with RT on XSX?
 

Keihart

Member
Minecraft with full path-traced RT was 1080p/unstable 60fps running on a much more powerful RTX 2080 Ti and surely a much more powerful CPU. How can you expect 4K/60 with RT on XSX?
I don't think any game will use full RT in the near future; it doesn't make sense. They can still hit that performance and use some RT effects. That would count as using RT.
 

Andodalf

Banned
I’m curious to see how many games truly have full blown RT or just partial RT implementation. Also curious to see how different ways or alternatives to RT develop over next gen to save performance



Considering Infinite can run on the base XBO, I think 4K/60/some RT implementation is realistic for the XSX version.

Given how adamant they were about 60fps for Halo 5, it'll be interesting to see if they compromise on that for the base XBO version of Infinite. If it's 60, I'd expect we'll see something like 720p, which isn't ideal in 2020.
 

ZehDon

Member
Microsoft has really put out the best hardware they could have at this point.
Ray Tracing is great, but I'm really interested to see what they can do with ML. This gen, their new hardware will launch with their new API for the first time.
Yeah, ML is a little bit of a dark horse for me. It could end up being a complete waste of silicon, or it could be a really forward-thinking inclusion that helps the Series X punch above its weight. I've been really impressed with some of the up-scaling techniques, and in some cases I've actually preferred the ML up-scaled image to the native-res counterpart. It's really cool having this in the console, and I'm excited to see what gets done with it.
 

Lethal01

Member
I’m curious to see how many games truly have full blown RT or just partial RT implementation. Also curious to see how different ways or alternatives to RT develop over next gen to save performance



Considering Infinite can run on the base XBO, I think 4K/60/some RT implementation is realistic for the XSX version.

by "full blown rt" do you mean Minecraft level? if so, then only indies. We don't have the tech.
 

Romulus

Member
It's all bullshit hype until I see actual gameplay on-screen with AI in a live demo, but even then it'll probably be downgraded.
 

TheAssist

Member
So far the only interesting use of ray tracing has been reflections. Shadows are such a minor improvement that you will never notice them while actually moving through a world, and global illumination can be faked quite well. Sure, ray traced is better, but is it worth the performance if most people can hardly see the difference?

Maybe the examples we got so far aren't showing the full impact yet, but ray tracing right now (at least FULLY ray-traced images) doesn't seem worth the performance it costs. But I can see ray tracing certain aspects like reflections, which do significantly improve realism and picture quality.
 

darkinstinct

...lacks reading comprehension.
by "full blown rt" do you mean Minecraft level? if so, then only indies. We don't have the tech.
Quite a bit too early to say that. We know absolutely nothing about the performance, ease of implementation or anything aside from patents that describe how the hardware works.
 
I wonder if some major features of RT won't even be graphics-based, e.g. bullet and path detection with no hitboxes required, audio implementations, etc. Perhaps the horsepower and hardware RT can be put to better use than just making things shinier/prettier.
 

MadViking

Member
Minecraft is not a good example to show off RT. The game looks like a turd (and that's OK, it's part of its charm), and when you put RT on it, it just looks like a polished turd.
A better example for me was that Vega demo. Or show a demo where RT complements physics; show how the light changes when the scene changes in a non-predetermined way and how the player can interact with it.
 

TheAssist

Member
I would argue that games like Minecraft are probably the best you can use to show people what RTX can do. Everybody knows what vanilla Minecraft looks like, and that demo looks crazy.

Sure, if you want to compare a new rendering technique with one that became obsolete 10 years ago. It's like showing how awesome next-gen textures look next to a wireframe model. You can put any kind of modern rasterized lighting into Minecraft and it would look miles better than it does at stock.
I mean people have done that.

One thing I would be more curious about is what the workflow is going to be like. UE5 has shown some really good on-the-fly real-time adjustments for lighting, so no pre-baking all the time. So I would imagine that ray tracing could cut back on some of that kind of busywork. But is the overall process faster than traditional lighting models (faster for the devs to implement, I mean)?
 

Sw0pDiller

Banned
XSX ran Minecraft with raytracing after a one man/one day implementation at 1080p/45. With machine learning and optimization 4K/60 should be possible.
You DO know that 4K@60fps is an insane increase in pixels and data over 1080p@45, and therefore it is really "silly" to casually say it "should be possible" with some tweaks, right? It's at least four times as many pixels, plus a whopping increase in fps. At that resolution you'll need serious power, power that won't be available this coming generation.

It's like saying that a car that goes about 200 km/h would go 800 km/h with "some" tweaks.

No car can drive 800 without a rocket engine screwed on top.
 

darkinstinct

...lacks reading comprehension.
You DO know that 4K@60fps is an insane increase in pixels and data over 1080p@45, and therefore it is really "silly" to casually say it "should be possible" with some tweaks, right? It's at least four times as many pixels, plus a whopping increase in fps. At that resolution you'll need serious power, power that won't be available this coming generation.

It's like saying that a car that goes about 200 km/h would go 800 km/h with "some" tweaks.

No car can drive 800 without a rocket engine screwed on top.
That's why nobody will brute force it; they'll use techniques like DLSS to achieve that goal.
 

CobraXT

Banned
I am 100% sure RT will not be a thing in next-gen consoles. Maybe it will be worth it in 4-5 years when we see >40 TF GPUs with 2-3 TB/s of memory bandwidth. It's a performance killer, and it doesn't look any better than the traditional baked/semi-baked lighting methods unless they use full RT.
 

Lethal01

Member
Quite a bit too early to say that. We know absolutely nothing about the performance, ease of implementation or anything aside from patents that describe how the hardware works.
It's not. The consoles aren't going to give us 20x the performance of an RTX 2080 Ti or something like that, even with DLSS.
 

Panajev2001a

GAF's Pleasant Genius
It's made by one man; there is a demo on PC and it's amazing.


This is what I really like seeing thanks to consoles and PC’s raising the bar: small developer teams can achieve results it would have taken tons of hacks and clever tricks by much much larger teams before.

The improvements in tech and real-time rendering can be noticed in tons of fields: even in TV shows, where the CGI effects of something like The Mandalorian have little to envy in the theatrical Star Wars releases in terms of graphical presentation, and it used Unreal Engine 4.x for most of its scenes... imagine UE5 in two years on workstation-class GPUs of the time 🤯.
 

jimbojim

Banned
You DO know that 4K@60fps is an insane increase in pixels and data over 1080p@45, and therefore it is really "silly" to casually say it "should be possible" with some tweaks, right? It's at least four times as many pixels, plus a whopping increase in fps. At that resolution you'll need serious power, power that won't be available this coming generation.

It's like saying that a car that goes about 200 km/h would go 800 km/h with "some" tweaks.

No car can drive 800 without a rocket engine screwed on top.
Yep. Next-gen consoles aren't that powerful.

That's why nobody will brute force it; they'll use techniques like DLSS to achieve that goal.

You mean upscale a game from 1080p to 4k. /s

The XSX demo looked way less complex than the worlds created for Minecraft RTX here, which were used for testing the performance. I would keep expectations in check. Really. Even "ordinary" RT requires some serious power.
 

GymWolf

Member
So what should they use, Quake 2 RTX like nVidia did?
What about some modern game?

Oh look how realistic these fucking blocks or huge polygons are with better lights and shadows... no, they still look like shit even with Pixar levels of RTX.

I mean, they can feed Minecraft fans anything; they already play and love the game with the worst art design ever. Do you think they're not gonna be impressed by RTX?

But for people who think that Minecraft looks horrible, there is no amount of RTX that can change their minds.

Same with Quake 2: it just looks like an old-ass game even with RTX, don't kid yourself thinking otherwise.


Project Mara looks like a magnificent example of RTX; show more of that, or other games that really enhance the realism of the scene with RTX.
 

M1chl

Currently Gif and Meme Champion
What about some modern game?

Oh look how realistic these fucking blocks or huge polygons are with better lights and shadows... no, they still look like shit even with Pixar levels of RTX.

I mean, they can feed Minecraft fans anything; they already play and love the game with the worst art design ever. Do you think they're not gonna be impressed by RTX?

But for people who think that Minecraft looks horrible, there is no amount of RTX that can change their minds.

Same with Quake 2: it just looks like an old-ass game even with RTX, don't kid yourself thinking otherwise.
I think that with Minecraft the difference from non-RTX to RTX is quite noticeable. If they used Control, for example, people would bitch that they don't see the difference.
 

GymWolf

Member
I think that with Minecraft the difference from non-RTX to RTX is quite noticeable. If they used Control, for example, people would bitch that they don't see the difference.
Dude, it looks like shit even with RTX, c'mon...



I saw other videos where they completely changed the textures with the addition of RTX, but that is cheating.
The video I posted is official and it only shows RTX on or off without other enhancements, and it looks horrible; I chuckle every time someone says it looks impressive, tbh.
 

M1chl

Currently Gif and Meme Champion
Dude, it looks like shit even with RTX, c'mon...



I saw other videos where they completely changed the textures with the addition of RTX, but that is cheating.
The video I posted is official and it only shows RTX on or off without other enhancements, and it looks horrible; I chuckle every time someone says it looks impressive, tbh.

I am talking about the difference. It's clearly noticeable. I haven't said anything about it "looking good"...
 

GymWolf

Member
I am talking about the difference. It's clearly noticeable. I haven't said anything about it "looking good"...
Of course there is a difference, I never said the contrary, but for me it is the worst example of RTX because the scene still looks like a PS1 game with terrible art design even with RTX on.

How can you enhance the realism in Minecraft when the game doesn't look realistic to begin with?
It's like trying to make a pig fuckable just by putting a Victoria's Secret bra on it...
 

M1chl

Currently Gif and Meme Champion
Of course there is a difference, I never said the contrary, but for me it is the worst example of RTX because the scene still looks like a PS1 game with terrible art design even with RTX on.
Yes, but as you might know, people on the internet, even with a shitty connection, will see the difference and that it's more colorful, etc. I think this approach is a smart way to market it to the general public, because with a more advanced game it would be "hmm, and what is the difference?" They probably learned from games like Metro Exodus, where people were claiming that the non-RTX version was better. And when it comes to Control, DF did some fuckery where the non-RTX version does not have any shadows, etc., to be able to represent the difference. It's tough when devs have learned to fake stuff quite well.
 

GymWolf

Member
Yes, but as you might know, people on the internet, even with a shitty connection, will see the difference and that it's more colorful, etc. I think this approach is a smart way to market it to the general public, because with a more advanced game it would be "hmm, and what is the difference?" They probably learned from games like Metro Exodus, where people were claiming that the non-RTX version was better. And when it comes to Control, DF did some fuckery where the non-RTX version does not have any shadows, etc., to be able to represent the difference. It's tough when devs have learned to fake stuff quite well.
Well, as you can see in this very topic, some people are even less impressed by RTX on a game that looks like puke on screen.

I can turn up the saturation on my TV and have a more colorful Minecraft without RTX; that's not really that impressive to me. You are not gonna win me over with some water or lava reflections when both the lava and water look like shit.


If they can't demonstrate how RTX changes the graphics of modern games, then maybe RTX is not ready for consoles to begin with...
 

M1chl

Currently Gif and Meme Champion
Well, as you can see in this very topic, some people are even less impressed by RTX on a game that looks like puke on screen.

I can turn up the saturation on my TV and have a more colorful Minecraft without RTX; that's not really that impressive to me. You are not gonna win me over with some water or lava reflections when both the lava and water look like shit.
The thing is that it's probably the closest thing to "full" RTX, so like... it looks like shit, it's a tech demo, but every single pixel is generated differently, and this type of implementation is probably not possible in that timeframe. I don't understand why Nvidia did not moneyhat some bigger studio and create some good game with RTX.
 

GymWolf

Member
The thing is that it's probably the closest thing to "full" RTX, so like... it looks like shit, it's a tech demo, but every single pixel is generated differently, and this type of implementation is probably not possible in that timeframe. I don't understand why Nvidia did not moneyhat some bigger studio and create some good game with RTX.
Because they sell their overpriced GPUs even without RTX demonstrations. :ROFLMAO:

But I expect some pretty good demonstrations when they show the 3000 series.
 

Deleted member 775630

Unconfirmed Member
Dude, it looks like shit even with RTX, c'mon...



I saw other videos where they completely changed the textures with the addition of RTX, but that is cheating.
The video I posted is official and it only shows RTX on or off without other enhancements, and it looks horrible; I chuckle every time someone says it looks impressive, tbh.

That already looks amazing in Minecraft (imo); looking forward to seeing what it can bring to big AAA games that strive for realism.
 