Unreal Engine for Next-Gen Games | Unreal Fest Online 2020

It seems I did a double post about this here, if mods want to delete/merge mine go ahead.

More detailed info on the UE5 engine and its new technologies, such as Nanite and Lumen, as well as a deeper look at the May demo. The incredible stuff shown here is truly a giant next-gen leap, and it's a great showcase of what next-gen games can look like in about 1-2 years from now.


Most interesting parts of the video:
10:00 - Nanite, Virtualized Geometry, pixel-sized polygons. Tech easily runs at 60fps on PS5.
16:40 - Lumen, realtime Global Illumination with no baking. Tech is currently running at 30fps but is aiming for 60fps on PS5.
30:45 - A deeper look at the UE5 Demo.
42:36 - Deeper look at working with Lumen in realtime in UE editor.


Dates: UE5 early preview in 2021, release in late 2021.
 
Other places have already done a good job explaining Lumen and Nanite to us when they first revealed the demo.
It's much more in depth, showcasing the tech in more detail than before, and they talk actual frame rates and texture sizes, etc. Perhaps this is too technical for a gaming forum and more suited to industry developer forums instead, but the parts I timestamped should be easily digestible for those interested.
 
Unreal is good etc.

BUT the real problem is the guy's chair. Wtf was he thinking lol, his shoulders look so distracting haha. Yeah, I know, I'm approaching "sharp knees" territory, but God damn, I was thrown off lol
 
Other places have already done a good job explaining Lumen and Nanite to us when they first revealed the demo.
This has a lot more than just lumen and nanite.

Most likely it isn't getting traction because some part of it kind of goes against the popular narrative on this forum.
 
So just as I suspected, Nanite uses a laughably small amount of resources - 4.5ms, just about 1/8th or 1/4th of the entire 30/60FPS frame-time budget, and not even 1GB of RAM - and it's Lumen that completely tanks the performance to 30fps, and is most likely responsible for the resolution going down all the way to 1400p. Good deep dive into the technology, much better than the initial reveal, but still, give me games that actually look like late-UE3.5/early-UE4 tech demos and then I'll actually get excited, because until then it's all just show and no go.
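The budget fractions above roughly check out; a quick sanity check (the 4.5ms figure is from the talk, the rest is just arithmetic):

```python
# Frame-time budget check for the Nanite cost quoted in the talk.
nanite_ms = 4.5
budget_30fps = 1000 / 30   # ~33.3 ms per frame at 30fps
budget_60fps = 1000 / 60   # ~16.7 ms per frame at 60fps

print(f"{nanite_ms / budget_30fps:.0%} of a 30fps frame")  # ~14%, roughly 1/8th
print(f"{nanite_ms / budget_60fps:.0%} of a 60fps frame")  # ~27%, roughly 1/4th
```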
 
So just as I suspected, Nanite uses a laughably small amount of resources - 4.5ms, just about 1/8th or 1/4th of the entire 30/60FPS frame-time budget, and not even 1GB of RAM - and it's Lumen that completely tanks the performance to 30fps, and is most likely responsible for the resolution going down all the way to 1400p. Good deep dive into the technology, much better than the initial reveal, but still, give me games that actually look like late-UE3.5/early-UE4 tech demos and then I'll actually get excited, because until then it's all just show and no go.

The Elemental demo has been surpassed already.
 
So just as I suspected, Nanite uses a laughably small amount of resources - 4.5ms, just about 1/8th or 1/4th of the entire 30/60FPS frame-time budget, and not even 1GB of RAM - and it's Lumen that completely tanks the performance to 30fps, and is most likely responsible for the resolution going down all the way to 1400p. Good deep dive into the technology, much better than the initial reveal, but still, give me games that actually look like late-UE3.5/early-UE4 tech demos and then I'll actually get excited, because until then it's all just show and no go.
I would say that it's very likely they'll be able to do Lumen at 60fps on PS5 on release at or close to the demo quality.

Nanite uses a laughable amount of resources? Are you talking about the "hardware resources" required to do its thing, or the fact that it's able to handle vast amounts of geometry data?
What is of most concern to me is the amount of storage a game might require if they want to release games with assets of this quality. But due to the speed of the new SSDs they don't need to use duplicated data on disk to stream in new sections of the map quickly. This will save a lot of disk space and compression will reduce disk size further.
 
The Elemental demo has been surpassed already.

That castrated PS4 version, yes, but Samaritan, Infiltrator, A Boy and His Kite, etc. are still out of reach for actual gameplay. Maybe HFW or HB2 will come close, but like I said, I have yet to see actual gameplay of those games, not cinematics.
 
What is of most concern to me is the amount of storage a game might require if they want to release games with assets of this quality. But due to the speed of the new SSDs they don't need to use duplicated data on disk to stream in new sections of the map quickly. This will save a lot of disk space and compression will reduce disk size further.

Yeah, I'm curious about that one too. At the end of the day the devs will be limited by the 100GB a Blu-ray disc can contain.
 
The Samaritan demo has been surpassed too. The last Batman game equaled that demo.

There is not a single element of graphics in which AK gets even within striking distance of Infiltrator.

The lighting alone costs more than everything else put together in AK.


 
Samaritan was UE3, technically 3.9, the old iteration. Most if not all of its features are commonly used in this gen: SSR, tessellation, bokeh, area lights, SSSSS (screen space subsurface scattering), volumetric fog and lights, etc. I think Batman: Arkham Knight is pretty similar.

Infiltrator, I feel, is ahead of what current consoles can do: too many materials, too many shadows, too many particles.
 
There is not a single element of graphics in which AK gets even within striking distance of Infiltrator.

The lighting alone costs more than everything else put together in AK.




Dude.....I said the Samaritan Demo. Not Infiltrator.
 
Really curious to see these billions of polygons running on actual hardware; at the moment it's just a taunt.
 
Really curious to see these billions of polygons running on actual hardware; at the moment it's just a taunt.
But it ran on actual PS5 hardware. Also, when they say billions of polygons it's a bit misleading. Sure, the statue contains 33 million polys, and they had a lot of them in the tomb, so it might sound like there are billions of polygons on screen, but the way Nanite works is that it will only draw roughly one polygon per pixel, derived from the statue's geometry data. At 2560 x 1440 you'll only need 3,686,400 polys on screen at any given time, not billions.
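The one-polygon-per-pixel upper bound is easy to verify (the resolution and the 33-million-poly statue come from the demo; the per-pixel rule is a simplification of what Nanite actually does):

```python
# Upper bound on drawn polygons if Nanite resolves roughly one triangle per pixel.
width, height = 2560, 1440
polys_on_screen = width * height
print(polys_on_screen)  # 3686400

# Compare with the raw source geometry of a single asset from the demo:
statue_polys = 33_000_000
print(statue_polys / polys_on_screen)  # ~9x more polys in ONE statue than pixels on screen
```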
 
But it ran on actual PS5 hardware. Also, when they say billions of polygons it's a bit misleading. Sure, the statue contains 33 million polys, and they had a lot of them in the tomb, so it might sound like there are billions of polygons on screen, but the way Nanite works is that it will only draw roughly one polygon per pixel, derived from the statue's geometry data. At 2560 x 1440 you'll only need 3,686,400 polys on screen at any given time, not billions.
The fact that it runs on actual hardware is yet to be demonstrated; it ran on a demo machine or devkit.

I didn't quite get how the technology works; in the wireframe view I saw a lot of polygons generated, and the final image shows it too. I like a lot of what I see and hope it will turn out to be doable. For me this is the real next gen so far, because next gen, in my opinion, has always meant more polygons on screen and more, nicer, and smoother 3D assets.
 
The fact that it runs on actual hardware is yet to be demonstrated; it ran on a demo machine or devkit.

I didn't quite get how the technology works; in the wireframe view I saw a lot of polygons generated, and the final image shows it too. I like a lot of what I see and hope it will turn out to be doable. For me this is the real next gen so far, because next gen, in my opinion, has always meant more polygons on screen and more, nicer, and smoother 3D assets.
Of course it's running on PS5 dev kits, but they are as close to final hardware as you'll get. Dev kits usually have a bit of extra RAM and maybe some slight processing overhead, but that is only to enable proper monitoring and debugging during development. The performance needs to be on par with the consumer version.

How exactly the Virtual Geometry Textures used in Nanite work isn't publicly known yet. I've read some of the research papers that early development was drawing upon. If the final version is anything like those, it's a bit like skinning the 3D mesh the way you do with an animal hide and then flattening it into an image; the image then contains the original XYZ coordinates for each point of the surface, stored in the RGB channels. We now have the 3D mesh stored as a 2D image, and from this image we can generate a new mesh at an arbitrary polygon density simply by using a matching mipmap level of that image. Finally, the 3D mesh isn't drawn quite like you normally do in games on the GPU; they use some custom approach to rasterizing it that I don't fully understand yet.
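As a toy illustration of that geometry-image idea (my own sketch, not Epic's implementation; the height-field surface and 2x2 averaging are assumptions for the example):

```python
import math

# Toy "geometry image": a 2D grid whose R,G,B channels store the X,Y,Z position
# of a point on the surface. The surface here is a simple height field; real
# geometry images flatten an arbitrary mesh parameterization into the grid.
N = 8  # grid resolution per side (a power of two makes mipmapping trivial)

def surface(u, v):
    """Hypothetical smooth surface: XY from the parameterization, Z a height field."""
    return (u, v, math.sin(u * math.pi) * math.cos(v * math.pi))

# Full-resolution geometry image: an N x N grid of (x, y, z) tuples -> N*N vertices.
geo_image = [[surface(i / (N - 1), j / (N - 1)) for j in range(N)] for i in range(N)]

def mip(img):
    """One mipmap step: average each 2x2 block of positions.
    The coarser image describes the same surface with 1/4 as many vertices."""
    n = len(img) // 2
    out = []
    for i in range(n):
        row = []
        for j in range(n):
            block = [img[2*i][2*j], img[2*i+1][2*j], img[2*i][2*j+1], img[2*i+1][2*j+1]]
            row.append(tuple(sum(p[k] for p in block) / 4.0 for k in range(3)))
        out.append(row)
    return out

coarse = mip(geo_image)  # 4x4 grid -> 16 vertices instead of 64
print(len(geo_image) ** 2, len(coarse) ** 2)  # 64 16
```

Picking a mip level at draw time is then just choosing how many vertices per screen pixel you want, which matches the "arbitrary polygon density" idea above.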
 