The long-awaited Oblivion Remastered patch is finally coming

interesting video, i did not understand it tho, maybe a tech expert on neogaf can summarize


This video provides a detailed and critical analysis of GPU optimization failures in the Unreal Engine-based Oblivion Remastered, focusing primarily on Nanite rendering technology, Unreal Engine's default rendering behaviors, and Lumen's lighting system. The presenter, working from a save game provided by a viewer, investigates specific areas of the game where poor performance is evident, using frame-by-frame GPU timing data and in-depth GPU pipeline analysis.

The video begins by highlighting the excessive cost of small, seemingly insignificant render passes and explains how these inefficiencies stem from suboptimal default behavior in Unreal Engine, especially concerning fog, anti-aliasing, and prepass rendering. It points out that Unreal's default velocity prepass runs pixel shaders unnecessarily, and that restricting velocity output to the base pass would yield nearly a 30% gain on that pass.
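
As a rough illustration of where a figure like that could come from, here is a minimal toy cost model in C++ (not code from the video or from Unreal): a prepass that must run a velocity pixel shader per fragment is compared against a depth-only prepass. The per-fragment cost constants are arbitrary assumptions chosen purely for the example.

```cpp
// Toy cost model of a depth/velocity prepass. Illustrative assumptions only,
// not measurements from the video and not Unreal Engine code.
#include <cstdio>

struct PrepassConfig {
    bool writesVelocityInPrepass;  // true: a velocity pixel shader runs per fragment
};

// Hypothetical per-fragment costs in arbitrary units (assumptions, not measurements).
constexpr double kDepthOnlyCost  = 1.0;  // depth write only, no pixel shader bound
constexpr double kVelocityPSCost = 0.4;  // extra cost of running a velocity pixel shader

double PrepassCost(const PrepassConfig& cfg, double fragmentCount) {
    double perFragment = kDepthOnlyCost;
    if (cfg.writesVelocityInPrepass)
        perFragment += kVelocityPSCost;  // pixel-shader exports forfeit the depth-only fast path
    return perFragment * fragmentCount;
}

int main() {
    const double fragments = 1.0e6;  // fragments touched by the prepass (assumed)
    double withVelocityPS = PrepassCost({true}, fragments);
    double depthOnly      = PrepassCost({false}, fragments);
    std::printf("prepass with velocity pixel shader: %.0f units\n", withVelocityPS);
    std::printf("depth-only prepass:                 %.0f units\n", depthOnly);
    std::printf("saving from moving velocity out:    %.1f%%\n",
                100.0 * (withVelocityPS - depthOnly) / withVelocityPS);
}
```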

A major focus is Nanite's rendering pipeline and its reliance on a software rasterizer for small triangles, which bypasses hardware depth testing (Z testing). This results in massive overdraw and wasted GPU cycles, with Nanite producing roughly 2900% more overdraw than optimized Z-tested rendering. The video refutes advice from Epic Games representatives who downplay the impact of overdraw, emphasizing that overdraw equates to wasted GPU work and directly hurts performance.
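
To make the overdraw argument concrete, here is a minimal standalone sketch (a toy single-pixel model, not Nanite's actual code): with early depth rejection and front-to-back ordering only the visible fragment gets shaded, while a path with no early rejection pays for every overlapping fragment.

```cpp
// Toy model of overdraw on a single pixel: early hardware Z rejection vs. shading
// every covering fragment. Purely illustrative; not Nanite's or Unreal's code.
#include <cstdio>
#include <limits>
#include <vector>

struct Fragment { float depth; };  // one overlapping triangle's depth at this pixel

// Hardware-style path: front-to-back order plus early-Z rejects hidden fragments before shading.
int ShadedWithEarlyZ(const std::vector<Fragment>& frags) {
    int shaded = 0;
    float nearest = std::numeric_limits<float>::max();
    for (const Fragment& f : frags) {
        if (f.depth < nearest) {   // only shade fragments closer than what is already stored
            nearest = f.depth;
            ++shaded;
        }
    }
    return shaded;
}

// Path with no early rejection: every covering fragment is processed, visible or not.
int ShadedWithoutEarlyZ(const std::vector<Fragment>& frags) {
    return static_cast<int>(frags.size());
}

int main() {
    // 30 overlapping layers on one pixel, submitted front to back (the favourable case
    // that a depth prepass or sorted submission gives the hardware path).
    std::vector<Fragment> frags;
    for (int depth = 1; depth <= 30; ++depth)
        frags.push_back({static_cast<float>(depth)});

    std::printf("fragments shaded with early-Z:    %d\n", ShadedWithEarlyZ(frags));    // 1
    std::printf("fragments shaded without early-Z: %d\n", ShadedWithoutEarlyZ(frags)); // 30
}
```

The 30x gap in this toy setup happens to be in the same ballpark as the ~2900% figure quoted above, but the video's number comes from real GPU captures, not from a model like this.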

The presenter also exposes misunderstandings and conflicting advice from Epic's own Nanite developers, showing that Nanite is slower than traditional rendering for many content types and that its memory savings are often outweighed by performance penalties. The video carefully explains how clustered geometry, small triangles, and software rasterization cause inherent inefficiencies that developers cannot easily fix.

Further, the video delves into a detailed analysis of the visibility buffer, showing how leaks and errors in Nanite's cluster-based visibility culling cause overdraw that effectively doubles shading work. It also critiques Unreal Engine's use of large, high-resolution textures accessed simultaneously in base passes, which further degrades performance compared to the optimized streaming and baking techniques used in other engines.
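
The cluster-leak point can be pictured with a small toy example (illustrative numbers only, not Nanite's implementation): culling operates on a cluster's bounding volume, so a cluster whose bounds poke past an occluder survives the test even when every triangle inside it is hidden, and those triangles still generate downstream work.

```cpp
// Toy sketch of conservative cluster-culling "leaks": a cluster's bounding volume can
// pass an occlusion test even when every triangle inside it is actually hidden.
// Illustrative only; all depths and triangle counts are made-up assumptions.
#include <cstdio>
#include <vector>

struct Cluster {
    float boundsNearDepth;  // nearest depth of the cluster's bounding box
    float trianglesDepth;   // actual depth of the triangles inside (assumed flat)
    int   triangleCount;
};

int main() {
    const float occluderDepth = 10.0f;  // a large wall already in the depth buffer

    std::vector<Cluster> clusters = {
        {  5.0f,  5.0f, 128 },  // genuinely in front of the wall: must be drawn
        {  9.0f, 12.0f, 128 },  // bounds poke in front, triangles sit behind: a "leak"
        { 15.0f, 16.0f, 128 },  // bounds fully behind the wall: correctly culled
    };

    int processedTriangles = 0, wastedTriangles = 0;
    for (const Cluster& c : clusters) {
        bool survivesCulling = c.boundsNearDepth < occluderDepth;  // conservative bounds test
        if (!survivesCulling) continue;
        processedTriangles += c.triangleCount;
        if (c.trianglesDepth > occluderDepth)  // actually hidden, so this work was wasted
            wastedTriangles += c.triangleCount;
    }
    std::printf("triangles processed: %d (wasted: %d)\n", processedTriangles, wastedTriangles);
}
```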

The video continues by examining Unreal's virtual shadow maps (VSMs) and distance field shadows, pointing out their poor anti-aliasing with noisy results and their heavy reliance on Nanite, which compounds the performance issues. It also highlights how decals are handled inefficiently in Unreal compared to other engines, requiring a costly full prepass that most AAA games avoid.

When examining Lumen, the video notes that despite some performance improvements in Unreal Engine 5.6, these are largely due to hardware Lumen optimizations that are not representative of what developers typically use (software Lumen). It also calls out Epic's misleading demos that omit certain elements to boost performance artificially. The presenter strongly criticizes Lumen's noisy, flickering global illumination and lack of meaningful performance gains, calling it a visual and performance embarrassment.

Finally, the video stresses the importance of proper optimization tools and workflows, dismissing the notion of "overoptimization" but warning against poorly executed, labor-intensive optimization efforts. It advocates for better developer education and more automated tools to reduce inefficiencies and improve game performance industry-wide. The video concludes with a call to support the channel to continue producing high-quality technical content that drives demand for better graphics and performance in games.

### Highlights
- 🔍 Detailed frame-level GPU timing analysis reveals massive inefficiencies in Unreal Engine's Nanite rendering pipeline.
- ⚠️ Nanite's software rasterizer bypasses hardware Z testing, causing extreme overdraw and wasted GPU cycles.
- 🛑 Unreal Engine's default velocity prepass uses pixel shaders unnecessarily, costing nearly 30% of prepass performance.
- 🎭 Epic's official Nanite advice is often contradictory and incomplete, misleading developers about its effectiveness.
- 🌿 Overuse of micropolygon foliage and large texture atlases severely degrades performance in Nanite scenes.
- 💡 Lumen's global illumination remains noisy, flickering, and poorly optimized despite Unreal Engine 5.6 updates.
- 🚫 Unreal's decal rendering requires inefficient full prepasses, unlike optimized mesh-based decals in other engines.

### Key Insights
- 🔥 **Nanite's Software Rasterizer is a Critical Bottleneck:** The software rasterizer in Nanite, used for small triangles, disables hardware depth (Z) testing, which is a highly optimized GPU feature designed to reject hidden geometry early. This leads to an enormous increase in pixel overdraw—up to 2900% worse than optimized hardware Z testing—directly harming performance. Developers relying on Nanite must contend with these fundamental inefficiencies that cannot be fully mitigated through content optimization alone.

- 🧩 **Unreal Engine's Default Prepass Behavior is Suboptimal:** Unreal Engine's use of pixel shaders in velocity prepass rendering is unnecessary and costly. By shifting velocity calculations to the base pass, the performance of the prepass can increase by nearly 28.5%. This reveals a systemic inefficiency in the engine's default rendering pipeline, highlighting the importance of understanding and customizing engine defaults rather than blindly accepting them.

- ⚖️ **Misleading Official Guidance Undermines Developer Efforts:** Epic Games' own educators and spokespeople provide contradictory advice on Nanite usage, sometimes encouraging its use on low-poly content despite performance drawbacks. This confusion can mislead developers into adopting workflows that degrade performance rather than improve it. The video underscores the need for clear, consistent, and technically accurate guidance from engine creators.

- 🌐 **Visibility Buffer and Cluster Culling Errors Compound Overdraw:** Nanite's cluster-based visibility system cannot perfectly cull unseen geometry due to cluster bounds errors, causing "leaks" where hidden triangles are still processed. This results in cumulative overdraw equivalent to rendering the visibility buffer twice, which doubles GPU workload unnecessarily. This inherent flaw in Nanite's approach limits how much optimization can help in complex scenes.

- 🎨 **Excessive Texture Sampling Degrades Performance:** The use of multiple large, high-resolution textures in single draw calls, common in the analyzed game, exacerbates GPU load. Unlike engines that bake textures or stream in only the mip levels a view actually needs, Unreal's approach, especially with Nanite, leads to bloated material evaluation times and significantly increases draw cost (see the sketch after this list).

- 🌫️ **Lumen Global Illumination is Still Impractical for Many Games:** Despite improvements in Unreal Engine 5.6, Lumen's software path—the one most developers use—shows little to no performance gain. Its noisy, flickering lighting artifacts remain problematic. Epic's demos that show improvements rely heavily on hardware Lumen, which is not widely used or supported on target platforms, creating a misleading perception of Lumen's readiness.

- 🛠️ **Optimization Requires Better Tools and Workflow Improvements:** The video stresses that "overoptimization" is a myth; what exists is poorly executed optimization often due to lack of tools or workflow support. Automated or easy-to-use optimization tools are crucial to help developers avoid costly inefficiencies like those seen with Nanite and Unreal's default rendering settings. Industry-wide adoption of these tools would lead to better game performance and visual quality.
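
As a back-of-the-envelope illustration of the texture point above, the sketch below picks the mip level that matches an object's on-screen footprint and compares its memory footprint with always touching the full-resolution texture. The numbers (a 4096-wide texture, 4 bytes per texel, an 8-texels-per-pixel footprint) are assumptions for the example, and real GPUs select mips from UV derivatives rather than a single scalar.

```cpp
// Toy illustration of why mip-mapped/streamed textures cost far less memory traffic
// than always reading the full-resolution texture. Assumed numbers, not measurements.
#include <algorithm>
#include <cmath>
#include <cstdio>

// Mip selection from the linear screen-space footprint: how many mip-0 texels fall
// under one screen pixel along one axis. (Real GPUs derive this from UV derivatives.)
int SelectMip(double texelsPerPixel) {
    return std::max(0, static_cast<int>(std::floor(std::log2(texelsPerPixel))));
}

// Size of one mip level in MiB for a square texture.
double MipSizeMiB(int baseDim, int mip, double bytesPerTexel = 4.0) {
    double dim = baseDim / std::pow(2.0, mip);
    return dim * dim * bytesPerTexel / (1024.0 * 1024.0);
}

int main() {
    const int    baseDim        = 4096;  // a 4K-per-side texture (assumed)
    const double texelsPerPixel = 8.0;   // object is small/far: 8 mip-0 texels per pixel

    int mip = SelectMip(texelsPerPixel);
    std::printf("selected mip level: %d\n", mip);
    std::printf("mip 0 size:         %.1f MiB\n", MipSizeMiB(baseDim, 0));
    std::printf("selected mip size:  %.1f MiB\n", MipSizeMiB(baseDim, mip));
    // A streaming system never needs to load the higher mips for this object at all,
    // and the smaller working set is far friendlier to the texture cache.
}
```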

### Conclusion
This video exposes deep-rooted performance and rendering issues in Unreal Engine's Nanite and Lumen systems through meticulous technical analysis. It reveals how default engine behaviors and cluster-based rendering approaches lead to massive GPU inefficiencies and overdraw. The presenter challenges official Epic guidance and industry hype, advocating for more realistic developer expectations and better optimization tools. Ultimately, the video encourages the game development community to demand higher standards for engine performance and developer education, highlighting that current state-of-the-art techniques in Unreal Engine still fall short of optimized rendering solutions seen in other engines.
 
Wow! Thanks to A.I. here.
 
Nice they are patching it, but I'm still completely stuck on a quest line I can not complete, that stupid stolen-painting one, "Canvas the Castle". I shouldn't be surprised why I never finished the game when it first came out.
 
Excellent work. Now, why didn't they just wait until all this shit was fixed before releasing the game in the state it was released in? I get it: big boss Microsoft demands content for their sub service, but those of us who purchased get to be beta testers for you greedy cunts...?
 
His Nanite criticisms have some merit, but his complete lack of pointing out the benefits is telling. That he also doesn't focus on how much the cost of hardware Lumen has been reduced goes to show he simply has an axe to grind. "Most developers use software Lumen"? That's because hardware Lumen has been too expensive on consoles; of course performance benefits to hardware Lumen are a boon and would allow for more use of it.

I'm not sure who is the biggest whiner these days, him or Vex.
 
His Nanite criticisms have some merit but his complete lack of pointing out the benefits is telling...
I tend to agree. I watched one of his older videos where he broke down Nanite's overdraw issues, which was actually very good, but he's slanted far too negative for clicks and headlines now, and outright ignores the very real trade-offs developers have to make to actually ship a game.

I'll say this: if and/or when he finally presents his game, it'll need to have Carmack-levels of wizardry to justify his rage-baiting.
 