Herr Edgy
Member
So what was controlling the bugs & bats?
Also, how was this just an empty desert when it was 500 highly detailed statues?
Niagara. It's a particle system already in UE4. Just smartly made particles, really.
Another interview with Epic Games.
Ha ha, seriously! They could've just shown that and no other games, and everyone would have been super excited with no complaints whatsoever, so why didn't MS do this?
My theory is that they're trying to take the humble Xbox 360 approach. If anyone remembers, they showed off initial games like Full Auto, GUN and other REAL games, while Sony showed off Gundam Crossfire, Killzone and Motorstorm tech demos (actually, those weren't even tech demos, but CG trailers). Still, MS and its Xbox 360 got a similar reception, with gamers left wondering what was so next-gen about Full Auto; only their Gears of War demo looked anything close to what we were expecting from that generation. Oh, and Fight Night Round 3, but that was showcased on the PS3 at the time.
I remember that clearly: everyone foaming over the tech demos and being underwhelmed by the actual gameplay (I was one of them, btw). As it turned out, the gameplay was the truer indication of what to expect. Of course. I suspect history will repeat.
And all that without the RayTracing gimmick; finally, some voxel rendering of models!
I feel like this is as big a jump in realism as the PSX -> PS2 transition was!
I'm not 100% sure that voxels are on the leaf nodes of the octree; it could be anything there! That's what I'm waiting to hear about. It could be mesh fragments, but those don't tend to subdivide as easily as voxels.
"You will see some of the experiences built on Unreal running on PS5 as soon as... tomorrow."
!!!
What's the timestamp?
Around 11:05.
It could be that the interview was conducted the day before yesterday...
How far down could you reasonably compress a 33M-poly model with textures?
Naively storing a raw mesh, I'd say around 500MB. Textures are an interesting question, but presumably we're talking 1:1 with vertex detail (which comes out to 8K texture(s) in this specific case), so probably another 30-60MB on top of that.
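For a rough sense of scale, here's a back-of-envelope sketch of what a raw 33M-triangle mesh might weigh in a conventional indexed layout. The per-vertex and per-index byte counts are my own illustrative assumptions, not Epic's numbers; depending on layout you land anywhere from the ~500MB above (vertex data alone) to around 1GB.

```python
# Back-of-envelope size of a raw ~33M-triangle mesh in a typical indexed
# layout. Byte counts per vertex/index are illustrative assumptions.

def mesh_size_bytes(triangles, verts_per_tri=0.5, bytes_per_vertex=32,
                    bytes_per_index=4):
    """verts_per_tri ~0.5: on a closed mesh, vertex sharing roughly halves
    the vertex count relative to the triangle count."""
    vertices = int(triangles * verts_per_tri)
    vertex_buffer = vertices * bytes_per_vertex      # position + normal + UV
    index_buffer = triangles * 3 * bytes_per_index   # three indices per tri
    return vertex_buffer + index_buffer

print(mesh_size_bytes(33_000_000) / 1e6, "MB")  # ~924 MB before any compression
```

General-purpose compressors and mesh-specific quantisation would shrink that considerably, which is why the question of on-disk format matters so much here.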
I'm just trying to see at what point the game package size becomes a large concern with unique assets.
Any system like this has to have a way for the user to specify the target resolution of assets (something like amount of data per metre) when importing them. Especially for multiplatform dev, where you'd presumably just play with that lever on export to adjust the quality and size of packages for each platform.
And all that without the RayTracing gimmick; finally, some voxel rendering of models!
I feel like this is as big a jump in realism as the PSX -> PS2 transition was!
That's what Phil Spencer said about the next-gen leap.
This is the most fanboyish comment I've ever seen in my life.
I'm sure you've seen worse, but he's right about something.
He's not wrong, though. Your average PC will not be able to run it. But provided the PC has enough RAM and enough CPU power to do the decompression, an SSD is not required to run this on a PC.
Hopefully Denis Dyack will dump the Amazon engine and go UE5.
SSD is not going to make the graphics better. It still has to be rendered. It's not an add-on to the GPU.
Let's do some basic math here...
We know the demo was running mostly at 1440p.
We know the demo was running at 30fps.
We know the demo was using raw, uncompressed data.
We know the demo had pretty much one triangle per pixel.
I'll assume 32-bit colors.
I'll assume RAM doesn't exist.
Based on the above, you have 2560 x 1440 = 3,686,400 triangles/pixels.
Running at 30fps means you have 33.3ms to render a frame.
If you stream everything on the fly, you have to process all 3,686,400 pixels in less than 33.3ms.
32 x 3,686,400 / 8 = 14,745,600 bytes = 14.7456 MB.
Vertices are a better indication of throughput cost than triangles, but since they are generally almost equal, rounding up should be ok. You would have to output about 15MB of data per frame, multiply by 30 is about 450MB/s. That is what you would need to stream if you were literally sending data from the SSD to the GPU without any processing in the middle. That is too fast to be streamed from an HDD, but very doable from any SSD.
Note that this is output data being used as reference, not input. The input is inevitably higher, but it is unclear how much higher it would be. It all depends on how efficient the reading from the SSD is, i.e. if you're loading full textures and culling them later, or if you're loading primarily what you need and ignoring the rest.
Additionally, this completely ignores reused assets, and assumes that every triangle/pixel is completely unique, which generally is not the case.
Bottom line is, even if the streaming from SSD is 5 times larger, you would still be below 2.4GB/s.
Based on this extremely rough estimate of what would need to be streamed, I doubt the full capacity of the PS5 SSD is being used, and I extremely doubt that the XSX or PC would be incapable.
It's the engine that is the 'hero' here. They've completely changed the way of rendering things.
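The arithmetic in the post above can be checked directly. Every input here is the poster's stated assumption (1440p, 30fps, one triangle per pixel, 32 bits of payload, no RAM cache), not a measured figure:

```python
# Re-running the post's streaming estimate under its own assumptions.

width, height = 2560, 1440
fps = 30
bits_per_pixel = 32

pixels = width * height                      # 3,686,400 triangles/pixels
frame_bytes = pixels * bits_per_pixel // 8   # bytes needed per frame
per_second = frame_bytes * fps               # sustained streaming rate

print(frame_bytes)  # 14745600 bytes -> ~14.75 MB per frame
print(per_second)   # 442368000 bytes -> ~442 MB/s, the post's "about 450MB/s"
```

Even multiplied five-fold to account for input overhead, as the post suggests, that lands at roughly 2.2GB/s, under the 2.4GB/s figure quoted for the XSX drive.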
Some of you Xbox folks have really lost it after the UE5 reveal, and I don't understand why. XGS will be using this engine extensively as well. All your damage-control effort should be pointed at MS; demand a better reveal for July, imo. A bunch of random nonsense calculations with assumption after assumption won't change that. Sorry, but cheers.
Damn, you're blinding me with science here.
Assumption after assumption is not science. Sorry.
That HB2 "in engine" cinematic isn't something that's playable, though. It's an in-engine cinematic; it's nowhere close to what gameplay will look like. In-engine cinematics are always deceptive, all companies use them, and they shouldn't. The UE5 demo was actually playable and captured directly from the dev kit.
What makes you think I'm "Xbox folk", whatever that means? I'm primarily a PC gamer and overclocker.
Well, HB2 is a real game.
The "real game" was an in-engine rendered cutscene at 24fps.
The UE5 gameplay was 1440p, 30fps, had zero enemies and empty environments, and even had a slow animation to help the game load, among other scripted-as-fuck moments.
These are nice, but this was the PS4/UE4 overlap demo
I have a math problem for you
If PS5 is using the engine to stream in 90MB of raw data each frame in a 60fps game with a 5.5GB/s SSD, how will the game run on the Xbox SX with its 2.4GB/s SSD? Will they use half the detail each frame, or will they run the game at half the frames per second?
That's the developer's choice. They can do either one. Or they use compression and lose less detail while maintaining 60fps. Again, that's assuming there's no RAM. It might be as simple as keeping an additional 50MB in RAM for each frame to achieve the same thing. As long as there's enough RAM available, the same result could be achieved.
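Under the question's own assumptions (raw data, no RAM caching, no compression), the trade-off is straightforward arithmetic; a sketch using the poster's numbers:

```python
# 90MB of raw data per frame at 60fps needs 5.4GB/s. On a slower drive you
# either shrink the per-frame budget or the frame rate (ignoring the RAM
# caching and compression options mentioned in the reply).

MB, GB = 1e6, 1e9

def per_frame_budget(ssd_rate, fps):
    return ssd_rate / fps

def max_fps(ssd_rate, frame_bytes):
    return ssd_rate / frame_bytes

print(90 * MB * 60 / GB)                      # 5.4 GB/s needed at 60fps
print(per_frame_budget(2.4 * GB, 60) / MB)    # 40.0 MB/frame if fps is held
print(round(max_fps(2.4 * GB, 90 * MB), 1))   # 26.7 fps if detail is held
```

Which lever a port pulls (detail, frame rate, compression ratio, or RAM residency) is exactly the developer choice the reply describes.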
The "real game" was an in-engine rendered cutscene at 24fps.
But yeah, the UE5 demo was shit. Let's go with that.
The face of the character was shit, it was 1440p, 30fps, it had slow animations to help the game load, no enemies, pretty empty scenarios...
I agree. That demo did not take full advantage of the PS5 hardware. We'll see better stuff soon enough.
An in-engine cinematic of an actual game you are going to play.
The UE5 demo wasn't as impressive as some want to claim. The face of the character was shit, it was 1440p, 30fps, it had slow animations to help the game load, no enemies, pretty empty scenarios...
I'm sure we are going to see better things with UE5 soon enough.
Loving all your posts.
And then consider that only one game on PS4 was running on Unreal Engine: Days Gone. Will they now make 10-20 AAA games on PS5 with this engine? Not the exclusive devs, at least. So that's not such good news.
Now, my internet isn't great for watching live streams, so I wasn't able to FULLY appreciate the demo until I downloaded the 4K video and played it straight to my TV, and the quality is beyond the pale great. While it'll take up to two years to see games fully utilize UE5 to this level, I expect first-party titles to at least approach it in the meantime with their own in-house engines. I'm hyped.
Can you share the link so I can download it myself?
Epic's reveal of Unreal Engine 5 running in real-time on PlayStation 5 delivered one of the seismic news events of the year and our first real 'taste' of the future of gaming. A true generational leap in terms of sheer density of detail, alongside the complete elimination of LOD pop-in, UE5 adopts a radical approach to processing geometry in combination with advanced global illumination technology. The end result is quite unlike anything we've seen before, but what is the actual nature of the new renderer? How does it deliver this next-gen leap - and are there any drawbacks?
Watching the online reaction to the tech trailer has thrown up some interesting questions but some baffling responses too. The fixation on the main character squeezing through a crevice was particularly puzzling but to make things clear, this is obviously a creative decision, not a means to slow down the character to load in more data - it really is that simple. Meanwhile, the dynamic resolution with a modal 1440p pixel count has also drawn some negative reaction. We have access to 20 uncompressed grabs from the trailer: they defy traditional pixel counting techniques.
Some interesting topics have been raised, however. The 'one triangle per pixel' approach of UE5 was demonstrated with 30fps content, so there are questions about how good 60fps content may look. There have also been some interesting points raised about how the system works with dynamic geometry, as well as transparencies like hair or foliage. Memory management is a hot topic too: a big part of the UE5 story is how original, full fidelity assets can be used unaltered, unoptimised, in-game - so how is this processed? So, to what extent is the Lumen in the Land of Nanite tech demo leveraging that immense 5.5GB/s of uncompressed memory bandwidth?
Core to the innovation in Unreal Engine 5 is the system dubbed Nanite, the micro-polygon renderer that delivers the unprecedented detail seen in the tech demo.
"With Nanite, we don't have to bake normal maps from a high-resolution model to a low-resolution game asset; we can import the high-resolution model directly into the engine. Unreal Engine supports virtual texturing, which means we can texture our models with many 8K textures without overloading the GPU," Jerome Platteaux, Epic's special projects art director, told Digital Foundry. He says that each asset has an 8K texture for base colour, another 8K texture for metalness/roughness and a final 8K texture for the normal map. "So, we end up with eight sets of 8K textures, for a total of 24 8K textures for one statue alone," he adds.
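To see why virtual texturing matters here, it's worth totting up what those 24 8K maps would cost if they were all fully resident. The RGBA8 and BC7 figures below are my own assumptions about format, purely for scale; they're not from the interview:

```python
# Memory footprint of the statue's textures as described: 8 material sets,
# each with three 8K maps (base colour, metal/roughness, normal). Formats
# are assumed for illustration; virtual texturing keeps only the visible
# pages of these maps resident on the GPU.

texels_per_map = 8192 * 8192
maps = 8 * 3                                  # 24 textures of 8K each

rgba8 = maps * texels_per_map * 4             # 4 bytes per texel, uncompressed
bc7 = maps * texels_per_map * 1               # BC7 block compression: 1 B/texel

print(rgba8 // 2**30, "GiB uncompressed")     # 6 GiB for one statue
print(bc7 / 2**30, "GiB block-compressed")    # 1.5 GiB for one statue
```

Either number dwarfs a sensible per-asset budget, which is why only the sparse set of visible texture pages is ever streamed in.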
Since detail is tied to pixel count relative to screen size, there is no more hard cut-off: no LOD 'popping' as you see in current rendering systems. Likewise, ideally, it should not have that 'boiling' look you can see with standard displacement, as seen with the ground terrain in a game like 2015's Star Wars Battlefront (which still holds up beautifully today, it has to be said).
In lieu of triangle-based, hardware-accelerated ray tracing, the UE5 demo on PlayStation 5 utilises screen-space shadows, as seen in current-generation games, to cover small details, which are then combined with a virtualised shadow map.
"Really, the core method here, and the reason there is such a jump in shadow fidelity, is virtual shadow maps. This is basically virtual textures but for shadow maps. Nanite enables a number of things we simply couldn't do before, such as rendering into virtualised shadow maps very efficiently. We pick the resolution of the virtual shadow map for each pixel such that the texels are pixel-sized, so roughly one texel per pixel, and thus razor sharp shadows. This effectively gives us 16K shadow maps for every light in the demo where previously we'd use maybe 2K at most. High resolution is great, but we want physically plausible soft shadows
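The texel-per-pixel selection described in that quote can be illustrated with a toy mip calculation. The 16K virtual resolution matches the figure in the quote; the footprint model below is a simplification of mine, not Epic's method:

```python
import math

# Toy virtual-shadow-map mip selection: for each screen pixel, pick the
# mip whose texel size in world space matches the pixel's world-space
# footprint, keeping shadows at roughly one texel per pixel.

VIRTUAL_RES = 16384            # full virtual shadow-map resolution (16K)

def shadow_mip(pixel_footprint_world, light_extent_world):
    """Mip 0 = full 16K; each coarser mip doubles the texel's world size."""
    texel_world = light_extent_world / VIRTUAL_RES   # mip-0 texel size
    ratio = max(pixel_footprint_world / texel_world, 1.0)
    return min(int(math.log2(ratio)), int(math.log2(VIRTUAL_RES)))

# A nearby pixel with a tiny footprint gets the sharpest pages...
print(shadow_mip(0.002, 32.0))   # 0
# ...while a distant pixel settles for a much coarser mip.
print(shadow_mip(0.25, 32.0))    # 7
```

Since only the pages actually sampled are rendered and cached, the "16K shadow map for every light" never exists in memory all at once.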
We were also really curious about exactly how geometry is processed, whether Nanite uses a fully software-based raw compute approach (which would work well across all systems, including PC GPUs that aren't certified with the full DirectX 12 Ultimate) or whether Epic taps into the power of mesh shaders, or primitive shaders as Sony describes them for PlayStation 5. The answer is intriguing.
"The vast majority of triangles are software rasterised using hyper-optimised compute shaders specifically designed for the advantages we can exploit," explains Brian Karis. "As a result, we've been able to leave hardware rasterisers in the dust at this specific task. Software rasterisation is a core component of Nanite that allows it to achieve what it does. We can't beat hardware rasterisers in all cases though so we'll use hardware when we've determined it's the faster path. On PlayStation 5 we use primitive shaders for that path which is considerably faster than using the old pipeline we had before with vertex shaders."
The other fundamental technology that debuts in the Unreal Engine 5 technology demo is Lumen - Epic's answer to one of the holy grails of rendering: real-time dynamic global illumination. Lumen is essentially a non-triangle, ray-tracing-based implementation of bounced lighting, which distributes light around the scene after the first lighting hit.
"Lumen uses ray tracing to solve indirect lighting, but not triangle ray tracing," explains Daniel Wright, technical director of graphics at Epic. "Lumen traces rays against a scene representation consisting of signed distance fields, voxels and height fields. As a result, it requires no special ray tracing hardware."
To achieve fully dynamic real-time GI, Lumen has a specific hierarchy. "Lumen uses a combination of different techniques to efficiently trace rays," continues Wright. "Screen-space traces handle tiny details, mesh signed distance field traces handle medium-scale light transfer and voxel traces handle large scale light transfer."
And finally, the smallest details in the scene are traced in screen-space, much like the screen-space global illumination we saw demoed in Gears of War 5 on Xbox Series X. By utilising varying levels of detail for object size and utilising screen-space information for the most complex smaller detail, Lumen saves on GPU time when compared to hardware triangle ray tracing.
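The three-tier hierarchy described above (screen-space for fine detail, mesh distance fields for the medium scale, voxels for the large scale) amounts to a cascade of progressively cheaper, coarser traces. Here's a structural sketch; the tier functions are stand-ins of mine returning (hit, radiance) pairs, since the point is the fallback order, not the shading:

```python
# Sketch of Lumen's tiered trace: try the finest representation first and
# fall back to coarser ones. Each tier is a placeholder for illustration.

def trace_screen_space(ray):
    return (False, None)   # placeholder: real pass marches the depth buffer

def trace_mesh_sdf(ray):
    return (False, None)   # placeholder: sphere-traces per-mesh distance fields

def trace_voxels(ray):
    return (True, 0.2)     # placeholder: coarse global voxel lighting always hits

def lumen_trace(ray, tiers=(trace_screen_space, trace_mesh_sdf, trace_voxels)):
    for tier in tiers:                 # fine -> medium -> coarse
        hit, radiance = tier(ray)
        if hit:
            return radiance
    return 0.0                         # total miss: a sky term would go here

# A ray that leaves the screen but hits a nearby distance field:
print(lumen_trace("ray", tiers=(lambda r: (False, None),
                                lambda r: (True, 0.8),
                                trace_voxels)))  # -> 0.8
```

The design win is that expensive fine-grained tracing only runs where the cheap screen-space information is unavailable, which is how the GPU time stays far below full triangle ray tracing.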
Another crucial technique in maintaining performance is through the use of temporal accumulation, where mapping the movement of light bounces occurs over time, from frame to frame to frame.
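Temporal accumulation of this kind is, at its simplest, an exponential moving average of each frame's noisy lighting into a running history. This is a minimal illustration of the principle; the blend factor is an illustrative choice of mine, not Epic's:

```python
# Minimal temporal accumulation: blend each new, noisy frame into a
# converged history with an exponential moving average.

def accumulate(history, current, alpha=0.1):
    """Keep 90% of the history, fold in 10% of the new frame."""
    return history + alpha * (current - history)

# A noisy per-frame signal whose true value is 1.0:
samples = [0.0, 2.0, 0.5, 1.5, 1.0, 1.0, 1.0, 1.0]
h = samples[0]
for s in samples[1:]:
    h = accumulate(h, s)
print(round(h, 3))  # 0.578: smoothly converging toward 1.0
```

The trade-off is the usual one for temporal techniques: noise is amortised across frames at the cost of some lag, which is why fast lighting changes take a few frames to settle.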
Alex from DF just put up an Inside Unreal Engine 5 article
Inside Unreal Engine 5: how Epic delivers its generational leap
So making bugs move using the particle system isn't giving them artificial intelligence?
No, it's not. Is making the player move by pressing a button artificial intelligence? Your character even knows how to climb a wall, jump or fit into tight places, and all you do is hold the analog stick!
Have any games in the following generation ever come close to these Unreal Engine demo videos? And as far as gameplay goes, this doesn't even look truly playable; it's more like on-rails.
If a game on PS5 uses 5GB of VRAM and PC has 20GB VRAM cards, it can store up to 4x the amount of data that the GPU will need.
The PS5 has 16GB of VRAM; the 2080 Ti only has 11GB. Maybe if you mean a Titan. Otherwise, what's needed may not even fit in memory if it uses over 11GB of VRAM.
You can just load more assets than the PS5.
The PS5 has 16GB of VRAM; outside the Titan, most cards have 8 or 11GB of VRAM.
That flight was quite fast, and the character movement wasn't out of the ordinary.