The problems with the original console versions of Rage were pinned on I/O bandwidth, but if you solve that problem, you just shift the bottleneck to other points, like storage space.

It was dropped in id Tech 7; Doom Eternal is the only id Tech game that doesn't use megatextures. It's not even a bandwidth problem but a game-size one. With megatextures you literally can't use a texture in more than one place, and that limits how big they can actually be before the game is so big it won't fit on players' hard drives. It also works against PBR and ray tracing, so with those technologies being pushed forward it stops being an asset and starts being a liability.
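To put rough numbers on the size problem (every figure below is an invented, illustrative assumption, not data from any actual id Tech game): if every surface gets its own unique texels, storage scales with the total textured area of the level rather than with a reusable texture set.

```python
# Back-of-envelope: storage cost of uniquely texturing a whole level.
# All numbers below are made-up assumptions for illustration only.

texel_density = 256          # texels per metre, per axis (assumed)
textured_area_m2 = 2_000_000 # total unique surface area in the level (assumed)
bytes_per_texel = 1          # roughly 8:1 block compression on RGBA8 (assumed)

texels = textured_area_m2 * texel_density ** 2
base_bytes = texels * bytes_per_texel
with_mips = base_bytes * 4 / 3   # a full mip chain adds about one third on top

print(f"unique texels: {texels / 1e9:.1f} billion")
print(f"storage with mips: {with_mips / 2**30:.0f} GiB")
```

Even with aggressive compression, the total lands in the hundreds of gigabytes once nothing is reused, which is the whole point being made above.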
It's enabled via a hidden option in the game files, so it's probably in testing of some sort, or perhaps abandoned. Either way, it's clearly not a finished or optimised implementation.

Is it an official implementation, or one injected through a mod?
The performance impact is still high on current hardware, but dismissing it as a novelty is naive. Many games already look far better with RT, for example the next-gen version of Spider-Man, Dying Light 2 on PC, the Metro Exodus RT update, Cyberpunk 2077 on PC, and others. Yes, it's definitely early days, but it's the future of rendering in games without a doubt.

I have a 3090 and there's not a single game that's better with ray tracing on. It's a complete novelty, and an unimpressive one, for the moment.
Maybe on PC, where it's still got RT stability/flickering issues, but it's a big, noticeable difference for me on the PS5, where it tops out at 60fps anyway. But yeah, early days either way.

I have Spider-Man. It's crap with ray tracing. When you're actually playing, the only perceptible difference is a cratered frame rate. You can stop and admire Spider-Man's reflection on the side of a building. Cool. That was fun for about 3 seconds.
Haven't tried the others. Ray tracing might be the future, but it's just a sales gimmick and a niche feature at the moment.
The Cycle: Frontier at 1440p, and I also used it downscaled from 4K and it had a similar impact. This was definitely an extreme case... I usually get 10-20 fps more on DLSS quality. It's F2P on Steam if you wanna try for yourself.

30-40 fps gained on quality mode? With what game?? And at what resolution?
Sometimes you don't get even close to gaining 40 fps even on performance mode, ffs.
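For what it's worth, the spread in gains makes sense once you look at how few pixels DLSS actually shades internally; the per-axis scale factors below are the commonly cited ones, the rest is just arithmetic, so treat it as a rough sketch rather than benchmark data. How much of the pixel saving turns into FPS depends on how GPU-bound the game is.

```python
# Pixel counts at common DLSS internal resolutions for a 2560x1440 output.
# Scale factors are the commonly cited per-axis values; treat them as approximate.

output = (2560, 1440)
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

out_pixels = output[0] * output[1]
for name, scale in modes.items():
    w, h = int(output[0] * scale), int(output[1] * scale)
    print(f"{name}: {w}x{h} = {w * h / out_pixels:.0%} of native pixels")
```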
Too bad the hardware isn't nearly powerful enough to guarantee both things, so why waste time on it when you have a way lighter solution that looks almost as good, and could use the hardware for something that doesn't have any alternative, like, I don't know, some fucking physics, or that Nanite stuff to improve LODs and detail?!
The "before" picture is what you get when you bake real-time lighting, as we have done for decades.

Ray-traced lighting can be baked and doesn't have to be real-time. This is what they did in Returnal: they recorded the light setup but also touched it up. The cost is shifted to bandwidth, in light files that were 300 MB in size.
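For anyone curious what "baking ray-traced lighting" actually means, here is a minimal toy sketch of the idea (nothing to do with Returnal's real pipeline, and the one-sphere "scene" is invented): trace rays offline per lightmap texel, average the result, and ship that grid instead of tracing anything at runtime.

```python
# Minimal sketch of "baking" ray-traced lighting offline: Monte Carlo ambient
# occlusion for a flat floor with one sphere sitting on it, written into a tiny
# lightmap grid. Purely illustrative -- not how any particular engine does it.
import math, random

SPHERE_C, SPHERE_R = (0.5, 0.5, 0.25), 0.25   # sphere resting on the floor (z up)

def hits_sphere(origin, direction):
    """Return True if the ray origin + t*direction (t > 0) hits the sphere."""
    oc = [origin[i] - SPHERE_C[i] for i in range(3)]
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - SPHERE_R ** 2
    disc = b * b - 4.0 * c            # direction is unit length, so a == 1
    return disc > 0 and (-b - math.sqrt(disc)) / 2.0 > 1e-4

def sample_hemisphere():
    """Uniform random direction on the upper (z >= 0) hemisphere."""
    z = random.random()
    phi = 2.0 * math.pi * random.random()
    r = math.sqrt(max(0.0, 1.0 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

def bake_lightmap(size=8, rays=256):
    """Occlusion value per texel of a [0,1]^2 floor patch; 1.0 = fully open sky."""
    lightmap = []
    for j in range(size):
        row = []
        for i in range(size):
            p = ((i + 0.5) / size, (j + 0.5) / size, 0.0)   # texel centre on floor
            unoccluded = sum(not hits_sphere(p, sample_hemisphere()) for _ in range(rays))
            row.append(unoccluded / rays)
        lightmap.append(row)
    return lightmap                    # this grid is what would be saved to disk

if __name__ == "__main__":
    for row in bake_lightmap():
        print(" ".join(f"{v:.2f}" for v in row))
```

The runtime cost is just a texture fetch; the trade-off is exactly the one described above: the tracing work moves offline and the result takes up storage and bandwidth instead.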
Can't you only do that if the game is a PC exclusive? Like, how do you make a game that ONLY uses RTX work on console?
But that also means you are restricted to having most of your environment and lighting not changing or moving around. The question is which is the driving force: do most games have mainly static environments and therefore pre-baked lighting, or do most games have pre-baked lighting and therefore static environments? Destructible environments are something I would love to see more of, and RT may be what we need to make them possible.

Well, yes, ray tracing baked into lightmaps is an age-old technique used across generations. UE itself has full support for it and always has.
Real time raytracing is useless if most of your environment and lighting aren't gonna move/change around.
The raytraced setting just removes what makes the game attractive.
Returnal also simulated raytracing well.
Can't help but feel ray tracing has become a technical obsession rather than an important benchmark for games. Games don't need it, and it just wastes most of the resource budget.
Nah dude, many people just don't notice bad lights/shadows or don't care. I don't know if it's a brain thing, a perception thing, or a sight thing (I don't think so, since I have 13/10 vision), but I can swear on my life that I really don't notice much difference in most games with RTX, except some cherry-picked moments in some locations.

Nanite has an alternative, premade LODs; it looks worse and has obvious drawbacks. Baked lighting is no more a solution than just using the same LOD systems we've had for the last few decades.
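For reference, "premade LODs" boils down to authoring a few meshes per asset and picking one by distance (or screen coverage) at runtime; a minimal sketch with invented thresholds and mesh names:

```python
# Minimal sketch of classic premade-LOD selection: a few hand-authored meshes
# per asset, chosen by camera distance. Thresholds and names are illustrative.
LOD_TABLE = [          # (max distance in metres, mesh name)
    (15.0,  "hero_prop_lod0"),   # full detail
    (40.0,  "hero_prop_lod1"),
    (120.0, "hero_prop_lod2"),
]
FALLBACK = "hero_prop_lod3"      # lowest detail beyond the last threshold

def pick_lod(distance_m: float) -> str:
    for max_dist, mesh in LOD_TABLE:
        if distance_m <= max_dist:
            return mesh
    return FALLBACK

if __name__ == "__main__":
    for d in (5, 30, 90, 300):
        print(d, "->", pick_lod(d))
```

The obvious drawback mentioned above is visible right in the table: detail changes in discrete steps, which is what Nanite's continuous LOD is meant to avoid.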
Just like using baked lighting looks much worse than ray tracing. IMO baked lighting has looked terrible in every game, and I can't wait for ray tracing to become more prominent. I laugh when people say it looks almost as good, when there isn't a single game I can play where I'm not distracted every second by how fake all the lighting looks.
Cinema is literally made with real lights, or in the case of CGI, it's made with ray tracing.

Lots of games are trying to convey a specific cinematic look, which would be too difficult to do with realistic lighting.
Ray tracing can be customized and used to create fake, unrealistic lighting just as much as rasterized techniques can. There is nothing about ray tracing that forces you to be 100% realistic. You could literally use the technique to totally mimic rasterized light.

Lots of games deliberately use unrealistic lighting, specifically to achieve an over-the-top feel, or mystery, or whatever else.
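A toy illustration of that point: in the sketch below the ray is only used as a yes/no visibility query, and the actual shading is a deliberately unrealistic banded ramp. The scene, names, and numbers are all invented for the example.

```python
# Tiny sketch: nothing about tracing rays forces a realistic result. Here the
# direct lighting term is quantised into flat toon bands, and the ray is only
# used as a shadow query. The one-sphere "scene" is made up for illustration.
import math

LIGHT_DIR = (0.577, 0.577, 0.577)               # roughly normalised light direction
BLOCKER_C, BLOCKER_R = (0.0, 0.0, 2.0), 0.5     # a sphere that can shadow the point

def shadow_ray_blocked(origin, direction):
    """Classic ray/sphere test: does a shadow ray from `origin` hit the blocker?"""
    oc = [origin[i] - BLOCKER_C[i] for i in range(3)]
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - BLOCKER_R ** 2
    disc = b * b - 4.0 * c
    return disc > 0 and (-b - math.sqrt(disc)) / 2.0 > 1e-4

def toon_shade(point, normal, bands=3):
    """Deliberately unrealistic: ray-traced visibility feeding a banded ramp."""
    if shadow_ray_blocked(point, LIGHT_DIR):
        return 0.15                               # flat, stylised shadow value
    ndotl = max(0.0, sum(normal[i] * LIGHT_DIR[i] for i in range(3)))
    return round(ndotl * bands) / bands           # quantise instead of being "correct"

if __name__ == "__main__":
    # A point facing straight up: lit and banded, or flat-shadowed if occluded.
    print(toon_shade(point=(0.0, 0.0, 0.0), normal=(0.0, 0.0, 1.0)))
```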
Yeah, Rage used their megatexture tech, which allowed them to burn a lot of detail into map textures. I think they dropped it afterwards due to problems such as detail popping in when turning (from the lack of I/O bandwidth) and the sheer complexity of their solution.
Interesting, but was the process automated? (Yes, it was.) If so, it may very well be worth revisiting.
EDIT: as a matter of fact, it seems I'm not the only one speculating about a possible comeback for this.
There are still problems in terms of storage space and excessive VRAM usage, and it might not be as efficient as using ray tracing, though I suppose it's circumstantial.

Current-gen I/O is built for that tech. It was ahead of its time.
This is why the new consoles have 16 GB of GDDR6. The Matrix Awakens streams massive textures. On PC, AMD Smart Access Storage streams assets directly from storage to the GPU.
Still doesn't seem like a very efficient use (though again, circumstantial). You're basically spending hardware budget that could go to real-time lighting tech instead of a static one, or to massive detail through Nanite + Lumen.
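The way sparse virtual texturing (the general form of megatextures) keeps VRAM in check is by only keeping the tiles the camera recently sampled resident and streaming the rest on demand. Below is a toy sketch of that residency idea with a plain LRU policy and made-up tile sizes; real engines drive this from GPU feedback, not from a loop like this.

```python
# Toy sketch of the residency side of virtual texturing: a fixed-size LRU cache
# of texture tiles, where a miss stands in for "stream this tile from disk".
# Tile size, cache budget and the fake loaded data are all invented numbers.
from collections import OrderedDict

TILE_BYTES = 128 * 128 * 1          # 128x128 texels, 1 byte/texel after compression
CACHE_BUDGET = 64 * TILE_BYTES      # pretend we only have room for 64 resident tiles

class TileCache:
    def __init__(self):
        self.resident = OrderedDict()   # (mip, x, y) -> tile bytes, oldest first
        self.misses = 0

    def request(self, tile_id):
        """Called for every tile the current frame samples from."""
        if tile_id in self.resident:
            self.resident.move_to_end(tile_id)          # mark as recently used
            return self.resident[tile_id]
        self.misses += 1                                 # would kick off a disk read
        data = bytes(TILE_BYTES)                         # stand-in for streamed data
        self.resident[tile_id] = data
        while len(self.resident) * TILE_BYTES > CACHE_BUDGET:
            self.resident.popitem(last=False)            # evict least recently used
        return data

if __name__ == "__main__":
    cache = TileCache()
    # Fake "camera path": sweep across a row of tiles at mip 0, twice.
    for _ in range(2):
        for x in range(100):
            cache.request((0, x, 0))
    print(f"resident tiles: {len(cache.resident)}, misses: {cache.misses}")
```

Only the resident set costs VRAM; everything else is storage and streaming bandwidth, which is exactly the trade-off being argued about above.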
You certainly could. But why change what already works? "Cinematic" wasn't the right word. Maybe "artistic" is better?
It's not just that using ray tracing doesn't force you to be realistic; achieving whatever style you can imagine takes as much work with rasterization as it does with ray tracing.
AMID EVIL, which definitely isn't going for realistic looks, has ray tracing, and I think it at least helped the general look of the game.
Adding ray-tracing into games frequently looks bad because those games weren't made with such a thing in mind.
Developers also don't have much practice with it, yet.
And the hardware still kinda sucks for ray-tracing. Give it a few more generations.
But it's still a bit deeper than that.
Ray-tracing and baked lighting aren't really a 1-1 comparison.
All kinds of 2D games wouldn't even notice a difference by switching to ray-tracing, for example. Or, even worse, real-time lighting might break the game.
Lots of games are trying to convey a specific cinematic look, which would be too difficult to do with realistic lighting.
Lots of games deliberately use unrealistic lighting, specifically to achieve an over-the-top feel, or mystery, or whatever else.
However...
If you want your game to look like the real world, and you want it to react realistically, then ray-tracing is superior 100% of the time.
The only limiting factors, if realism is your goal, are hardware limitations, money, time, and developer skill. Luckily, all of those things will improve in a relatively short amount of time.
60FPS > today's standard for ray-tracing