Someone here once mentioned that Smartshift can execute in about 2ms.
That's actually quite a lot. If you're targeting 60fps, your frame budget is about 16.7ms, and a 2ms hit cuts that to roughly 14.7ms. If it's switching a lot, you're going to notice the hit.
But I'm not sure how it works. I'd guess the developers have much more control over it than the system trying to intelligently guess how to shuffle power. It would be too unpredictable otherwise?
Might be talking out of my well-toned sphincter, but as I understand it the tech actually anticipates the shift in advance.
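The frame-budget arithmetic above, sketched out (the 2ms figure is just the number quoted upthread, not an official spec):

```python
# Frame-budget math for the SmartShift discussion above.
# The 2 ms cost is the number quoted in the thread, not a measured spec.

def frame_budget_ms(fps: float, overhead_ms: float = 0.0) -> float:
    """Milliseconds left per frame after a fixed per-frame overhead."""
    return 1000.0 / fps - overhead_ms

print(round(frame_budget_ms(60), 2))       # 16.67 ms total per frame at 60 fps
print(round(frame_budget_ms(60, 2.0), 2))  # 14.67 ms left if 2 ms is lost
```

So the quoted 2ms would eat about 12% of a 60fps frame if it happened every frame, which is why it would only matter if the shift triggered constantly.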
What is great, you can redirect geometry rendering budget to other stuff.

To be precise, it's only about geometry rendering. There are other intensive tasks besides that, like Lumen, volumetric effects, particles, etc.
Sounds like Zootopia meets Terminator
That video from MGB, "PS5 games look so good they don't need ray tracing?", is one of his stupidest ones and won't age well at all.
I mean, technically, if they all adopt some form of global illumination, then maybe. But even that is faux ray tracing tbh.
Apparently that UE5 demo is not GPU intensive, about the same as Fortnite at 60 FPS according to Epic.
Looks like it's all about latency, doing things fast, and IO.

That amazing UE5 tech demo was only as GPU-intensive as Fortnite
And this is seriously good news for the future of gaming performance (www.pcgamesn.com)
But what about terrafloppies......?
I'm aware; I say faux because it achieves the same goals but with different results than traditional HW dedicated RT. (I also call prim shaders faux mesh shading for the same reason.) Software RT is amazing and often less taxing on the GPU, but I wouldn't count out HW RT completely. GI has been around for, iirc, almost a decade; not new, but never touted in the gaming world until the UE5 demo showed how damn clean it can look. Hmm, spidey sense is tingling.
You can have software ray tracing and hardware ray tracing. GI is ray tracing. If you just google GI vs ray tracing, the below turns up, from some PDF from a university.
"Global illumination," the more advanced form of ray tracing, adds to the local model by reflecting light from surrounding surfaces to the object. A global illumination model is more comprehensive, more physically correct, and it produces more realistic images.
That's great to hear. Can't wait to see what Ninja Theory is able to do with this and make it run at 4K native
Apparently that UE5 demo is not GPU intensive, about the same as Fortnite at 60 FPS according to Epic.
Do you think Hellblade will be using Nanite and Lumen, or standard UE5 upgraded from UE4?
They are not the same thing lol
The UE5 Nanite demo is not about TF; it needs low latency and a fast SSD. Fast, fast, and fast again. TF won't help much for Nanite, is what the message is here.
That's actually not surprising. It makes sense why Epic refused to release the demo specs and minimum requirements the entire month after its showing. I knew this demo was very achievable on comparable hardware.
How do you mean, what's the message here? The message is that Ninja Theory is a very skillful team with the money of Microsoft behind them now, and they're planning to use UE5.
Apparently that UE5 demo is not GPU intensive, about the same as Fortnite at 60 FPS according to Epic.
UE5 is literally UE4 with updated plugins; the engine is built on UE4. The entire magic behind Nanite (the technology, not the name of the land in the demo) builds on the same technology you find in Quixel Megascans. In fact, Quixel is free when you purchase UE as a student if you develop now, even on current-gen GDKs.
High LOD and GI are amazing, but they've been around as a technology for nearly 10 yrs at this point. Now, there is much more to UE5 than GI and Nanite, but that's all you'll hear people parrot around GAF, because they haven't used it and the idea of GI was foreign here before the tech demo.
(This is all running on a 6-yr-old software engine using the technology behind Lumen.)
If it's not GPU intensive, then why was it running at 1440p rather than 4K?
How could you tell it was 1440p if you weren't told?
Try again. It's not about TF, it's about IO and removing bottlenecks. How many times do we have to keep posting the same stuff?
A game about fish WOOOOOOOOOOOOOOOOO I CAN DREAM!
And that bloody Dolphin game is not a fish, if you're thinking about linking that.
How could you tell it was 1440p if you weren't told?

DF pixel counting, I guess.

Inside Unreal Engine 5: how Epic delivers its generational leap
Epic's reveal of Unreal Engine 5 running in real-time on PlayStation 5 delivered one of the seismic news events of the … (www.eurogamer.net)
How could you tell it was 1440p if you weren't told?

Are we keeping this same energy when a next-gen AAA 1st-party title does the same, though?
Congratulations?
You've mentioned Flops 3 separate times on this page of the thread alone. I've mentioned Flops exactly zero times. Are you arguing with yourself? What can be found in my post that had anything in the world to do with the price of rice in Taiwan, let alone Tflops?
If the tech demo pulled the same amount of assets needed to run Fortnite at 60 fps, then it's even more impressive, because it looks amazing and doesn't tax the GPU... as expected. Cheaper, crisper textures (from 1440p to full 8K) are the name of the game. LOD is the name of the game. RT alternatives are the name of the game: more bang for less processing buck. The engine was designed with mobile devices in mind; why would they design it to be GPU heavy?
Also, it seems like every response you've posted again attempts to imply that the PS5 is the standard hardware needed to run the engine... it's not. Please get help. Fast I/O writes are the standard of the next gen.
...can you guess which consoles will feature SSDs 6x as fast as current-gen SATA SSDs? ...Every next-gen console! (Lol, even the rumored Lockhart.)
No one is disputing what Epic pulled off, but you should, at some point, sit yourself down and have that hard, honest conversation with yourself: Sony does not own UE5.
If it's not GPU intensive, then why was it running at 1440p rather than 4K?
Less the ad hominem and personal insults, grow up.

Not a single ad homie in sight. It's cool to just ignore the post and not respond. Strawman arguments are the go-to here when there's nothing else to be said, I guess.
With the Nvidia RTX 2060 Super, meanwhile, you might expect Nvidia's proprietary DLSS standard to be your preferred option to get up to 4K resolution at 60fps. Yet astoundingly, AMD's FidelityFX CAS, which is platform agnostic, wins out against the DLSS "quality" setting.
Both of these systems generally require serious squinting to make out their rendering lapses, and both apply a welcome twist on standard temporal anti-aliasing (TAA) to the image, meaning they're not only adding more pixels to a lower base resolution but also smoothing them out in mostly organic ways. But FidelityFX CAS preserves a slight bit more detail in the game's particle and rain systems, which ranges from a shoulder-shrug of, "yeah, AMD is a little better" most of the time to a head-nod of, "okay, AMD wins this round" in rare moments. AMD's lead is most evident during cut scenes, when dramatic zooms on pained characters like Sam "Porter" Bridges are combined with dripping, watery effects. Mysterious, invisible hands leave prints on the sand with small puddles of black water in their wake, while mysterious entities appear with zany swarms of particles all over their frames.
Peace between feline and fish be upon you!
How could you tell it was 1440p if you weren't told?
That's not the point. The point is that it was indeed running at 1440p, which implies that it was GPU intensive. Otherwise, it would have been running at 4K or at least at 1440p but at 60 frames per second. Having said that, I'm not dissing the PS5, because I cannot wait to get one.
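For scale, here's the raw pixel math behind the 1440p-vs-4K back-and-forth (just arithmetic, nothing platform-specific):

```python
# Raw pixel counts for the two resolutions being argued about.
def pixels(width: int, height: int) -> int:
    return width * height

qhd = pixels(2560, 1440)  # 1440p: 3,686,400 pixels
uhd = pixels(3840, 2160)  # 4K UHD: 8,294,400 pixels

# Native 4K pushes 2.25x the pixels of 1440p, which is why rendering at
# 1440p and reconstructing upward saves so much GPU time per frame.
print(uhd / qhd)  # 2.25
```

That 2.25x gap is the whole argument in a nutshell: running at 1440p instead of native 4K frees up well over half the per-pixel shading work.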
And here is the caveat: the TIME spent rendering on PS5 was the same as Fortnite on current-gen consoles, lolol.
So funny, back to the drawing board lololol.
This, "If the tech demo pulled the same amount of assets needed to run Fortnite at 60 fps", is not the same as this: "If it's true that running the UE5 demo only took the 'geometry rendering budget' of Fortnite at 60fps on console."
That's one thing that makes UE5 so exciting. If it's true that running the UE5 demo only took the "geometry rendering budget" of Fortnite at 60fps on console, we might be in store for a game engine that really gives lower-end hardware something to work with. Sure, geometry rendering isn't the be-all and end-all of GPU gruntwork, and the GPU isn't the only component relevant to gaming performance, but geometry rendering certainly takes up a fair chunk of the graphics pipeline, and the GPU is the single most important component when it comes to gaming performance.
"Geometry rendering budget" is NOT EQUAL to "the same amount of assets needed to run Fortnite at 60 fps". It's a different concept.
Greetings
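To make that distinction concrete, here's a toy frame-time breakdown. Every number below is invented for illustration, not measured from UE5; the point is that the geometry slice the Fortnite comparison talks about is only one piece of the whole frame:

```python
# Hypothetical frame-time budget for one rendered frame (made-up numbers).
frame_ms = {
    "geometry": 4.0,     # the slice the "Fortnite budget" comparison is about
    "lighting_gi": 5.0,  # e.g. Lumen-style global illumination
    "shadows": 2.0,
    "post_fx": 2.5,
    "other": 3.0,
}

total = sum(frame_ms.values())
print(total)                         # 16.5 ms, roughly a 60 fps frame
print(frame_ms["geometry"] / total)  # ~0.24: geometry is only a quarter of it
```

So even if the demo's geometry pass cost no more than Fortnite's, the rest of the frame (GI, shadows, post) could still be far heavier, which is exactly the correction being made above.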
Correct; I speed-skimmed that post because much of it was buried in FUD (the original post I was responding to). But you're correct, and either way we slice it, that's impressive. We need more engines to follow the template of low cost / high detail. My post reiterates that this is, first, a multiplat engine, and we'll all benefit from everything Nanite and Lumen have to offer regardless of platform, period. If someone wants to argue against that, then they can't be helped imo.
I missed you the other day......

That was the other day.
A game about fish WOOOOOOOOOOOOOOOOO I CAN DREAM!
And that bloody Dolphin game is not a fish, if you're thinking about linking that.
As poorly as that dolphin is modeled, I could forgive someone for thinking it's a fish; it doesn't look right at all lol.

I so badly want to flame you for saying that, but the sad thing is you're right. FFS.
NeoGAF July Gif and Meme Contest *Winner Gets Gold*
Welcome to the NeoGAF July Gif and Meme Contest. To see last month's contest, click HERE!!! @SlimySnake was last month's winner with Now @SlimySnake must defend the title of Gif and Meme Champion. Rules 1. The Gif/Meme you submit must be your own creation. NO DIGGING UP OLD GIFS (KAZ GIFS)... (www.neogaf.com)
DF pixel counting, I guess.

It was not; they said at the time that the reconstruction technique was good enough that they got 4K from the image of the demo.