Unreal Engine is slowly causing the decay of the industry, what can be done?

I feel that's the main point. Most of these studios have hired a bunch of low talent goobers and thrown this engine at them, hence the results.
I'd say it's the other way around: low talent goobers learn nothing but how to use Unreal/Unity, and studios have no choice but to use it since those people are all they can find.
 
I feel that's the main point. Most of these studios have hired a bunch of low talent goobers and thrown this engine at them, hence the results.
It's not really about hiring
Accessible, standardized engines let creators make games even when they aren't technically savvy enough to build one themselves. There are one to two orders of magnitude more games released now than before mass-market engines arrived (though most of them are slop). And it scaled to every level of development: money was redistributed toward the creativity/presentation side while the technical work was pushed entirely onto UE/Unity, with the consequent drop in average technical performance.
It's not that companies can't find better engineers, but doing so would require cutting back on artists, game designers, etc., which means games would degrade and become simpler and uglier. And a lot of games would just die, especially in the indie space, even some with stellar ideas, if the common engines were taken away.
 
Your time is more precious than your money but you won't invest in a half decent PC? Okay lol.
Yeah pretty much. I like console gaming because I can just focus on playing something. If I wanna play Crusader Kings I'll do it on my PC.

But I am over the days of checking whether I get 5 extra FPS on medium settings over high settings - I just don't give a shit about that stuff anymore.
 
The reason a lot of studios use UE is because they don't have to pay a whole team to maintain and develop their engine.
It's simple math about what your core business differentiator is, yeah.
(Un)fortunately for games, the tech stack plays a significant role there no matter what. I.e. if everything is standardised, you have more opportunities to stand out by stepping out of line. This is why many studios, particularly last gen, built their core tech IP around the engine despite being UE/Unity houses.

They also save money on training new employees. Someone coming from another studio using UE to your studio using UE can get going right away.
This, however, makes very little difference in practice (based on 20+ years of first-hand observation across dozens of different engines, UE/Unity included). It's a nice little tale to tell execs when paying license fees, but that's really all it amounts to for the most part.
I.e. competent people are competent and the rest aren't, no matter how well they seem to match the tools, etc.

And it's all the more evident in the results most games get: if this actually meant something in practice, both UE and Unity would have built competitive moats multiple times over in the past 16 years, which never materialised for either.
 
This, however, makes very little difference in practice (based on 20+ years of first-hand observation across dozens of different engines, UE/Unity included). It's a nice little tale to tell execs when paying license fees, but that's really all it amounts to for the most part.
I.e. competent people are competent and the rest aren't, no matter how well they seem to match the tools, etc.
Interesting. What you're saying (based on what you've observed) is that a competent developer coming from DICE and Frostbite, for example, to work at id Software using id Tech 7 would not have as big a learning curve or time to contribution as I was suggesting?
 
Small studios losing 5% of gross revenue in perpetuity is a real problem.
So how is Epic supposed to make money and fund development of a brand new engine? They're giving you a platform to use; you don't need to spend $2,000 up front, or however much, to buy the product. You get a free engine, and you just have to give them 5% if you are planning to sell your game. I don't see anything wrong with that business practice. Steam takes 30% on sales until they reach $10 million, then it drops down to 25%.
 
What is it about UE5 that stops devs from making true S Tier Classics? Maybe I'm overlooking one, but I don't think any other engine has as many releases without a 10/10. Why is that?
The most probable GOTY of this year is made with UE5, and it's one of the best games of the past few years. Who gives a fuck if it doesn't score a 10?
 
ITT ignorant Gaffers who don't even have a concept of what a gaming engine is mouth off about things they're fundamentally ignorant of.

Thank God for the few voices in here from actual educated devs that can cut through the chaff.

Dunning-Kruger is one hell of a drug....
 
Small studios losing 5% of gross revenue in perpetuity is a real problem.
Small studios didn't have to spend tens to hundreds of thousands in license fees up front for the engine or spend just as much developing their own either.

5% gross is just one license option. Studios still have the ability to negotiate flat-rate license deals.
 
The most probable GOTY of this year is made with UE5, and it's one of the best games of the past few years. Who gives a fuck if it doesn't score a 10?
No, I can see 33 being a 10/10 if you're into turn-based games. I played enough to tell it was really high quality if it's a genre you like. Not for me, but yeah, it is probably good enough to retire that line. It still holds for the big-budget AAA stuff though.
 
Actually refreshing to see more people defend my work here.
Although "my work" isn't really accurate; I don't do anything that affects graphics or performance lmao
 
Have you ever developed a game with that engine, or any game at all?
If not, then why do you believe you have the expertise for a valid judgement?
On one hand that seems like a reasonable response to LakeOf9's comment, but on the other hand...

anyone with even the smallest play-around experience with UE5 knows that you could hand UE5 to expert developers who were novices in UE, and without becoming experts in UE5, what they would bake by default would be a very bloated representation of their game idea, because advanced systems are enabled by default and the Blueprint system no longer has a compile-to-native C/C++ option.

All game logic and branching by default effectively runs at (at best) the rate of a Java virtual machine interpreter (compared to C/C++), unless developers manually go native for critical code, which pushes their knowledge of UE5 much, much higher than the easy win middleware engine it is supposed to be, rather than writing their own lean-and-mean game engine and renderer.

By default, the larger UE community consists of less technical indies who don't have that expertise and will walk straight into those performance issues, so the criticism is probably fair that UE5 doesn't deliver fast game logic for the vast majority of its devs out of the box, even if they are prepared to turn off advanced lighting.
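For anyone wondering what "manually going native for critical code" looks like in practice, here is a minimal sketch of the usual pattern (the class and function names are made up for illustration; only the reflection macros are real engine API): the Blueprint graph calls one C++ node instead of looping through individual Blueprint nodes itself.

```cpp
// HotPathLibrary.h -- hypothetical example of moving a hot loop out of Blueprint.
// The Blueprint graph calls this single node, so the per-element work runs as
// compiled C++ instead of inside the Blueprint VM.
#pragma once

#include "CoreMinimal.h"
#include "Kismet/BlueprintFunctionLibrary.h"
#include "HotPathLibrary.generated.h"

UCLASS()
class UHotPathLibrary : public UBlueprintFunctionLibrary
{
	GENERATED_BODY()

public:
	// Stand-in for any per-frame loop that would be expensive as individual
	// Blueprint nodes: sum of squared distances from Origin to each point.
	UFUNCTION(BlueprintCallable, Category = "HotPath")
	static float SumSquaredDistances(const TArray<FVector>& Points, const FVector& Origin)
	{
		float Total = 0.0f;
		for (const FVector& Point : Points)
		{
			Total += FVector::DistSquared(Point, Origin);
		}
		return Total;
	}
};
```

The exact shape of the API doesn't matter; the point is only where the loop executes.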
 
Interesting. What you're saying (based on what you've observed) is that a competent developer coming from DICE and Frostbite, for example, to work at id Software using id Tech 7 would not have as big a learning curve or time to contribution as I was suggesting?
Absolutely. Specialisation has long been overvalued in software engineering, and big tech figured that out decades ago, unifying engineering ladders at the most successful companies (though granted this has been regressing in the AI gold rush, but no one has any idea what they're doing there yet).
It's true game companies have always lagged behind there, though.

For other disciplines that's even more obvious: content, design, art, etc. are areas where core competencies are far more important than any tool competency.

And I've personally witnessed the best people go from 0 to highly competent in 3 months, coming in with absolutely no game experience whatsoever, not just the "wrong engine".
The reverse also happens a lot; games have been suffering a massive "brain drain" on account of people leaving for tech jobs that pay much better, with much less stress to boot.
 
On one hand that seems like a reasonable response to LakeOf9's comment, but on the other hand...
anyone with even the smallest play-around experience with UE5 knows that you could hand UE5 to expert developers who were novices in UE, and without becoming experts in UE5, what they would bake by default would be a very bloated representation of their game idea, because advanced systems are enabled by default and the Blueprint system no longer has a compile-to-native C/C++ option.
I don't think that option was removed; it can still be done in UE5.6

All game logic and branching by default effectively runs at (at best) the rate of a Java virtual machine interpreter (compared to C/C++), unless developers manually go native for critical code, which pushes their knowledge of UE5 much, much higher than the easy win middleware engine it is supposed to be,
But it is the easy win middleware engine if you have some skills!

rather than writing their own lean-and-mean game engine and renderer.
Well, that heavily depends on the type of game you are making, but for most indies it's GameMaker for 2D games, Unity for 2D and low-end 3D games, and Unreal or CryEngine for high-end 3D.
So unless you want to do something very specific that those engines can't do, it would be quite uneconomical to make your own engine.
I worked with such a very specific engine before, and the only reason it was made was because it had to run on Neo Geo hardware.

By default, the larger UE community consists of less technical indies who don't have that expertise and will walk straight into those performance issues, so the criticism is probably fair that UE5 doesn't deliver fast game logic for the vast majority of its devs out of the box, even if they are prepared to turn off advanced lighting.
That some indie devs don't want to spend the time learning it properly is hardly the fault of the engine. But to be fair, the documentation for it could be better!
Those people would probably be better off going with Unity, as there are tons of premade plugins that make things easy.
 
Small studios didn't have to spend tens to hundreds of thousands in license fees up front for the engine or spend just as much developing their own either.

5% gross is just one license option. Studios still have the ability to negotiate flat-rate license deals.
100%!

BTW, UE3's license fees were $250-300k.

UE4 went up to $1.5-2 million iirc.

Switching to royalties helps smaller devs who can't put up that kind of cash up front. Expedition 33's devs have sold 4.4 million copies at just $45 a pop; giving Epic 5% of that, roughly $10 million, is nothing. Epic also doesn't take anything from the first million dollars of gross revenue.
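As a rough sanity check on those numbers (assuming full-price sales and ignoring the $1M exemption, discounts, regional pricing, and platform cuts): 4.4 million copies × $45 ≈ $198 million gross, and 5% of that is about $9.9 million.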
 
"Decay" for entire gaming industry over graphic engine?

Hyperbolic much?
You missed the part about posting on a gaming forum? Hyperbole is the only thing we do here.
That said UE has been the driving force behind a fair number of negative industry trends in the last 25 years. Not the least of it being the HD gen look, aka the brown and blurry.
But industry decay is much more of an industry fault than any tech stack, influential or not.
 
You missed the part about posting on a gaming forum? Hyperbole is the only thing we do here.
That said UE has been the driving force behind a fair number of negative industry trends in the last 25 years. Not the least of it being the HD gen look, aka the brown and blurry.
The UE engine was way worse in the PS3/360 era when it comes to performance; in fact, I would go as far as to say UE on PS3/360 was ugly.
But industry decay is much more of an industry fault than any tech stack, influential or not.
To me the gaming industry has its issues, but I don't believe it's "decaying" because we still get amazing games.
 
UE is amazing and one of the best things to happen to the game industry. Sure, competition is good; well then, make something better. Unity failed, so someone should make something more intuitive and better.

I use UE every day. The problems with the game industry are not the engine; no one just uses the engine as-is unless you are an asset flipper / porn game maker. Everyone makes their own modified version; my company makes modified versions for every game it makes. The problem with the industry is the shit bottlenecks from the last 15 years... the newest being the Switch 2. You have no idea how badly a system like that warps development even when you are not initially developing for it.
 
I don't think that option was removed; it can still be done in UE5.6
It was removed from UE5 and has only just returned in 5.6, but this performance topic is largely about the last 3-5 years in which the feature was removed, and many games have had unreliable, jittery performance on any hardware, rather than just lower performance like in the PS3/360 gen
But it is the easy win middleware engine if you have some skills!
But that isn't "some skills" IMHO, if you have to write UE interfacing native code to replace Blueprints. That is highly advanced Blueprint profiling and replacing default engine features - which IMO shouldn't be required for an easy win.
Well, that heavily depends on the type of game you are making, but for most indies it's GameMaker for 2D games, Unity for 2D and low-end 3D games, and Unreal or CryEngine for high-end 3D.
So unless you want to do something very specific that those engines can't do, it would be quite uneconomical to make your own engine.
I worked with such a very specific engine before, and the only reason it was made was because it had to run on Neo Geo hardware.
But this thread topic doesn't originate from 2D games; it is driven by 3D games, so out-of-the-box performance for indie 3D games, and even less specialist UE dev at bigger studios, is the yardstick, no? Building one engine per 2D game is likely to be far more economical - unless targeting mobile - and yield far more time for design/gameplay ideas than learning how to wrestle with and work around systems that are irrelevant or bloated overkill in a 2D game.
That some indie devs don't want to spend the time learning it properly is hardly the fault of the engine. But to be fair, the documentation for it could be better!
Those people would probably be better off going with Unity, as there are tons of premade plugins that make things easy.
It is effectively a race in which the advantages of using UE to save time/money/effort - and get features beyond team limitations - have to offset the downsides of no longer working with a blank canvas. Instead you are debugging a behemoth engine when things go wrong, and it is automatically funnelling your design into its systems or taking you further into the technical guts of a 3rd-party engine, just to try and deliver something that, on a blank canvas with your own flow control, is relatively straightforward and lightning fast in game logic terms regardless of complexity.
 
It was removed from UE5 and has only just returned in 5.6, but this performance topic is largely about the last 3-5 years in which the feature was removed, and many games have had unreliable, jittery performance on any hardware, rather than just lower performance like in the PS3/360 gen
You sure? Seems it was possible in older 5.X versions too: https://github.com/ArtemIyX/BlueprintCompileUnreal
As for unreliable performance, I don't agree with that; it seems to be mainly a PC problem and very rarely happens on consoles.
I mean, Fortnite runs a stable 60fps on a crappy Series S, but to be fair that game isn't really pushing any boundaries like AAA games do.
But that isn't "some skills" IMHO, if you have to write UE interfacing native code to replace Blueprints. That is highly advanced Blueprint profiling and replacing default engine features - which IMO shouldn't be required for an easy win.
Well, sorry, but that's an argument that doesn't make any sense. With Unreal being a generalist engine, you can't really demand that it be optimized right out of the box for every possible scenario devs use it for.
That is just a completely unrealistic (haha, get it) view! And yes, 99% of indie devs won't do that due to lack of time and resources, but it's not like they have much choice if they want to make a game!
But this thread topic doesn't originate from 2D games; it is driven by 3D games, so out-of-the-box performance for indie 3D games, and even less specialist UE dev at bigger studios, is the yardstick, no?
The problems in AAA UE5 games aren't caused by low-skill developers - anyone working at a big studio has worked on multiple projects before and usually knows what they are doing, otherwise they wouldn't get hired.
If a studio lacks Unreal knowledge, they have more than enough choice on the market for hiring experts.
But no matter how skilled your devs are, it's the publishers that often don't want to spend the money on optimizing games.
Building one engine per 2D game is likely to be far more economical - unless targeting mobile - and yield far more time for design/gameplay ideas than learning how to wrestle with and work around systems that are irrelevant or bloated overkill in a 2D game.
It isn't, because you have to keep console TRCs in mind if you want to publish your game there. And failing submission over that stuff is pretty easy!
That is one reason why so many indies pick Unity: it makes that stuff easy out of the box.
It is effectively a race in which the advantages of using UE to save time/money/effort - and get features beyond team limitations - have to offset the downsides of no longer working with a blank canvas. Instead you are debugging a behemoth engine when things go wrong, and it is automatically funnelling your design into its systems or taking you further into the technical guts of a 3rd-party engine, just to try and deliver something that, on a blank canvas with your own flow control, is relatively straightforward and lightning fast in game logic terms regardless of complexity.
I think you seriously underestimate what it means to make your own engine - at least if you don't want to release only on PC!
 
I do appreciate the democratization of game development with Unreal Engine and Unity being as pervasive as they are. They are the reason indie games have been able to take off the way they have. Games like Expedition 33 and Hollow Knight Silksong are only possible because of Unreal and Unity.

But I think these engines are doing more harm to the industry than not at this point, especially Unreal Engine. Unreal Engine is a massive, bloated piece of tech requiring an insane amount of overhead, especially if games are utilizing some of its more sophisticated features, which makes games run horribly – not just on low-end hardware with few spare resources such as the OG Switch or mobiles, but also, increasingly, on higher-end hardware. Games are stuttery messes even on PS5 Pro or high-end PCs, and shit like stutter and traversal stutter is a never-ending nightmare at this point.

It's getting worse – the technical standard of games has gotten horrible, but more importantly, we are now reaching a point where an entire generation of developers in the industry only have the skill set to work with Unreal and Unity, and no skill set to work with proprietary tech stacks at all. This makes the problem self perpetuating, and it means unoptimized games that barely utilize the hardware they are running on are becoming more and more common as time goes on.

Is there no fix to the problem? I can't believe I am saying this, but I really wish publishers and developers would start using their own internal tech stacks and engines for development again.

Did you even watch the Witcher 4 tech demo running on a PS5? That alone pretty much invalidates your argument, not to mention that without UE5 Expedition 33 would never exist. From a gamer's perspective, I don't see the issue if it allows smaller teams with tighter budgets to create great-looking games that punch way above their studio size, and enables the established big studios to create something hopefully close to Witcher 4's visuals and complexity. It brings to mind the old saying "Jack of all trades, master of none" applying to someone versed in multiple graphics engines...
 
I'd say it's the other way around: low talent goobers learn nothing but how to use Unreal/Unity, and studios have no choice but to use it since those people are all they can find.
I would say there are a limited number of people with the high level ability and skills needed. Not every doctor has the abilities to be a brain surgeon no matter how hard they work and train.
 
Blueprint to C++ Nativization is indeed gone in UE5, and it has very little, or even nothing, to do with games not performing as well as they should.

ScHlAuChi What you posted is a non-Epic plugin to compile Blueprints; nativization, which is what PaintTinJr was talking about, is not BPs compiling into BP VM code but into C++ code.
 
You sure? Seems it was possible in older 5.X versions too: https://github.com/ArtemIyX/BlueprintCompileUnreal
As for unreliable performance, I don't agree with that; it seems to be mainly a PC problem and very rarely happens on consoles.
I mean, Fortnite runs a stable 60fps on a crappy Series S, but to be fair that game isn't really pushing any boundaries like AAA games do.
Not what Epic released as the "engine" AFAIK. Those versions compile to bytecode and run in a bytecode interpreter.
Well, sorry, but that's an argument that doesn't make any sense. With Unreal being a generalist engine, you can't really demand that it be optimized right out of the box for every possible scenario devs use it for.
That is just a completely unrealistic (haha, get it) view! And yes, 99% of indie devs won't do that due to lack of time and resources, but it's not like they have much choice if they want to make a game!
No one is demanding it be optimised for their specific case, but interpreted UE5 game logic vs native UE4 is a fair criticism when games are typically single-core bound and flow control runs on the primary core/thread of any system. So they did have a choice: not use UE5 and stick with UE4, which pertains to the thread topic IMO.
 
Not what Epic released as the "engine" AFAIK. Those versions compile to bytecode and run in a bytecode interpreter.

No one is demanding it be optimised for their specific case, but interpreted UE5 game logic vs native UE4 is a fair criticism when games are typically single-core bound and flow control runs on the primary core/thread of any system. So they did have a choice: not use UE5 and stick with UE4, which pertains to the thread topic IMO.
Nativization really doesn't play a measurable role in a game's performance. It's just one more tool that didn't work that great to begin with to get some performance gains. But the game logic is rarely the reason a game is running badly; it's usually underlying tech that every game needs that doesn't work well at the scale the developer needs.
Aka a full animation setup for a character works decently enough with a few characters at once, but if you want to have dozens of characters on screen you will need additional logic to handle this.
Using nanite with a lot of assets that use sprites, introducing overdraw. Not using LODs, or not using instanced meshes etc.
Using overlay widgets in UMG/UI when not necessary, which introduces wasted draw calls. Not setting up texture streaming correctly.
Those and a million other things tank performance easily, and it's at scale where devs who don't know what they are doing use options without understanding the implications.

An engineer putting a single raycast on tick for the player character specifically in BP is not the reason for bad performance.

Game logic obviously _can_ tank performance just as any code in the engine can but any programmer that will make a game's perf go to shit by abusing raycasts in a loop on tick on 50 actors at once will find ways to make perf to go to shit with nativization too.
Any serious programmer in game dev using Unreal will write systems in C++ and configure the setup in Blueprints. The way you are describing it, with "they could have stuck with UE4, which had nativization", as if that's the reason why some UE5 games have some performance issues, is misleading.
Game dev is just complex, and there are a million screws to align with each other and no checklist that tells you "this is bad because of X and this is the solution Y".
Nativization was one of these screws, but a pretty minor one that didn't affect any semi-serious dev to begin with.
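To make the raycast example a bit more concrete, here is a hedged sketch (the actor and member names are invented, not any particular project's code) of the difference between tracing every frame and throttling the same trace with a timer; neither is presented as the one right answer, only as an illustration that per-frame work should be a deliberate choice:

```cpp
// SensorActor.h -- hypothetical illustration of the raycast point above: the
// same line trace either runs every frame in Tick() or is throttled to 10 Hz
// with a timer.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "SensorActor.generated.h"

UCLASS()
class ASensorActor : public AActor
{
	GENERATED_BODY()

public:
	ASensorActor()
	{
		// Only needed for the per-frame variant sketched in Tick() below.
		PrimaryActorTick.bCanEverTick = true;
	}

	virtual void BeginPlay() override
	{
		Super::BeginPlay();
		// Throttled variant: run the trace ten times per second instead of every frame.
		GetWorldTimerManager().SetTimer(TraceTimer, this, &ASensorActor::DoTrace, 0.1f, true);
	}

	virtual void Tick(float DeltaSeconds) override
	{
		Super::Tick(DeltaSeconds);
		// The per-frame variant would call DoTrace() here; harmless on one actor,
		// costly once copied onto dozens of actors that don't need it every frame.
	}

private:
	void DoTrace()
	{
		FHitResult Hit;
		const FVector Start = GetActorLocation();
		const FVector End = Start + GetActorForwardVector() * 1000.0f;
		GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility);
		// ...react to Hit here...
	}

	FTimerHandle TraceTimer;
};
```

On one actor the difference is noise; copied onto dozens of actors it starts to show up in the frame time, which is the "at scale" point being made above.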
 
Nativization really doesn't play a measurable role in a game's performance. It's just one more tool that didn't work that great to begin with to get some performance gains. But the game logic is rarely the reason a game is running badly; it's usually underlying tech that every game needs that doesn't work well at the scale the developer needs.
Aka a full animation setup for a character works decently enough with a few characters at once, but if you want to have dozens of characters on screen you will need additional logic to handle this.
Using nanite with a lot of assets that use sprites, introducing overdraw. Not using LODs, or not using instanced meshes etc.
Using overlay widgets in UMG/UI when not necessary, which introduces wasted draw calls. Not setting up texture streaming correctly.
Those and a million other things tank performance easily, and it's at scale where devs who don't know what they are doing use options without understanding the implications.

An engineer putting a single raycast on tick for the player character specifically in BP is not the reason for bad performance.

Game logic obviously _can_ tank performance just as any code in the engine can but any programmer that will make a game's perf go to shit by abusing raycasts in a loop on tick on 50 actors at once will find ways to make perf to go to shit with nativization too.
Any serious programmer in game dev using Unreal will write systems in C++ and configure the setup in Blueprints. The way you are describing it, with "they could have stuck with UE4, which had nativization", as if that's the reason why some UE5 games have some performance issues, is misleading.
Game dev is just complex, and there are a million screws to align with each other and no checklist that tells you "this is bad because of X and this is the solution Y".
Nativization was one of these screws, but a pretty minor one that didn't affect any semi-serious dev to begin with.
Completely understand what you are saying, but other than Nanite, why is there such a discrepancy between UE4 and UE5 games in the stability of their performance - where surely those developer skill misgivings should translate from UE4 to UE5 and just tank performance, rather than be unstable like it seems in many UE5 games, no?

The reason I had believed that the flow control with interpreted bytecodes vs native would have been a noticeable issue, was rooted in using Java two decades ago.

I realised that even with performance benchmarks showing great comparative performance for throughput with Java relative to C, it was actually the latency in the interfaces that really highlighted the performance difference in task switching.

So I've assumed the stability of the latency in the Blueprints interpreter would be similar, and as flow control is typically single core/thread bound, I assumed that latency instability would cascade up the engine, causing marginally bigger performance losses. But it is interesting that UE's VM doesn't have that issue, and you have displaced a misplaced view I've held of VM interpreters in general over the years.
 
Completely understand what you are saying, but other than Nanite, why is there such a discrepancy between UE4 and UE5 games in the stability of their performance - where surely those developer skill misgivings should translate from UE4 to UE5 and just tank performance, rather than be unstable like it seems in many UE5 games, no?

The reason I had believed that the flow control with interpreted bytecodes vs native would have been a noticeable issue, was rooted in using Java two decades ago.

I realised that even with performance benchmarks showing great comparative performance for throughput with Java relative to C, it was actually the latency in the interfaces that really highlighted the performance difference in task switching.

So I've assumed the stability of the latency in the Blueprints interpreter would be similar, and as flow control is typically single core/thread bound, I assumed that latency instability would cascade up the engine, causing marginally bigger performance losses. But it is interesting that UE's VM doesn't have that issue, and you have displaced a misplaced view I've held of VM interpreters in general over the years.
Just to be clear, VMs _are indeed_ slower. Blueprint code runs slower than C++ code despite achieving the same thing. But if, say, you have a single metric (average fps) that you measure performance by, that in itself won't tell you why that is and you have to look closer: what's the CPU doing, what is the GPU doing and so on.
And chances are you will run into performance issues because of content way before you run into them because of your own code, if you are somewhat competent.
Code on the game layer, not the engine layer, can easily tank performance just the same if you are doing things the wrong way, and some things definitely should be done in C++ instead of BP (although BP is completely fine for the average indie game).
My point isn't that game logic, especially using BP, can't be a problem, but moreso that if you are semi-competent and follow best practices and have general awareness for code performance, this isn't the first problem perf-wise you'll run into when you make a new Unreal project.
It's C++ systems or GPU work that are made to do more lifting than they should by the users, and that's due to content being abused.

Some of the knowledge required is arcane, but a lot of it is ignored best practice.

I am not even sure that UE4 games performed better on average; UE5 games run fine for the most part although some games definitely don't.
That used to be the case in UE3 and UE4 games also though. To what degree this has changed, I can not say.
There is a perception of that in social media currently, but a lot of it is just voices screaming louder than before, or people grifting, such as Threat Interactive.
But let's say indeed average performance has gone down, more frame dips, more stutters and so on: then I'd say it's a combination of new technology having to go through a ripening process where systems aren't as optimized as they should be yet, and simply a misguided perception of the gamer populace.
For example, people can point to bad frames without frame generation and say "why the hell does it perform this badly without frame generation?" and continue playing without DLSS or frame gen etc.
But that's because they hold on to their outdated perception of how games are supposed to work.

There is no "supposed to work": it's all about tradeoffs. Which tradeoffs are chosen one might disagree with, but their existence is based in reality and can not simply be "fixed".
For example, if modern tech is disproportionally more expensive for modern hardware due to raytracing, one can either a) wait for hardware to catch up, b) not use raytracing as much as possible or c) use the proposed solutions such as DLSS and frame gen.
The idea of "fake frames" and "real frames" didn't exist before, but they are literally solutions to some of the frame problems. Solutions that one might not like, but they are solutions regardless.
It's not that people just made expensive systems and rendering for no reason, saw low frames, and called it a day. It's all an ecosystem working together.
Bright minds decided this is the best possible course, which doesn't mean it's ideal.

As for "where are things going wrong where developer skills or lack of skills should have translated from UE4 -> UE5" (implying the engine is at fault; arguably not 100% true but let's say it is):
Changes to technology require changes to workflows. For example, overdraw is and will continue to be a big problem in dense foliage environments.
But with Nanite one should handle foliage differently compared to before, namely to use geometry for leaves, not alpha cards. Alpha cards were costly in UE4 as well, but the strategy to work with them has changed between UE4 and UE5 with Nanite.
It's possible however that not everyone got the memo.

The same goes for Lumen. Lumen wants geometric thickness to solve light leaks. Previously in UE4, this often wasn't necessary and walls in interior spaces could be flat planes.
This also goes for temporal artefacts/smear etc.
There are ways to mitigate and work with them, but not everyone has made the transition to new workflows or paradigms.

That being said, it's not just the developer's responsibility; or rather, it is, because it's a professional effort, but in the complex world of modern game dev no one has a complete idea of all the details, so Epic can help improve the situation, and we do with every engine release.
 
The UE engine was way worse in the PS3/360 era when it comes to performance; in fact, I would go as far as to say UE on PS3/360 was ugly.
That was what I was alluding to with the 'browns', which was the biggest negative influence UE had that gen ;P Of course, it didn't help that there were no credible 3rd-party alternatives - open-source options were bad to non-existent, Unity didn't exist for most of the gen, and CE didn't run on consoles until the generation was over.

Also - performance was consistently bad across all tech stacks, proprietary or not - you don't get the 'average PS360 game' running at 25fps if only one stack was performing poorly. Sure, UE3 might have performed another 20% worse, but going from 25fps to 21fps is just different shades of bad; there were no good options.

To me the gaming industry has its issues, but I don't believe it's "decaying" because we still get amazing games.
Statistically, the industry is larger than it's ever been (the 2000s were positively tiny in comparison), meaning we also get a lot more software released (e.g. in 2024 alone, Steam had more games released than all of PS360WiiNDSPSP combined in their lifetime). With that kind of volume it's inevitable we still get good stuff on a regular basis - I think 'decay' is mostly aimed at propagation of certain business models and general decline of software optimization (which has been a universal trend for the past 20 years).

My point isn't that game logic, especially using BP, can't be a problem, but moreso that if you are semi-competent and follow best practices and have general awareness for code performance, this isn't the first problem perf-wise you'll run into when you make a new Unreal project.
I'd agree in principle - but worth noting that 'best practices' are not what they used to be, and awareness of performance even less so.
Nothing exemplifies this better than one of the tweets I've seen a few years back comparing costs of running a 'modern' UI framework (Flutter) on XSX vs. what Scaleform (or other homegrown Flash renderers) used to take on 360 - in CPU costs alone.
We're talking 10x worse performance on modern consoles - for 0 gain - flutter adds nothing that scaleform couldn't do for game UI development - only new (and arguably worse in some respects) tools (and possibly new people who have less familiarity with realtime performance working in games). Of course it has a working license - since Autodesk killed XF.
And this is on top of the fact UI framework performance already sucked in 360 era. Another Flash framework (before XF was really a thing) was literally responsible for an entire series of sequels running at 30fps instead of 60 in the PS2 era - until 2-3 years in when it got 'fixed'.

Also back in late 360 era I was on a project (not Unreal - but all the same principles applied) that swapped entire gameplay layer from script to native C++ 'for performance reasons'. It went about as well as you'd imagine (meaning not at all) - especially when you consider engineers that used to work in script land were now expected to write high performing C++ code.

But I do agree with you there are other - lower hanging performance gotchas you're likely to run into faster - where things get ugly is that these things are usually stacked - so even if you fix one, the other issues likely 'also' exist, it's why many games with poor performance have no simple way to dig themselves out of that hole - other than simply throwing more hardware at the problem that is.
 
I think 'decay' is mostly aimed at propagation of certain business models and general decline of software optimization (which has been a universal trend for the past 20 years).
The reason for this trend is pretty simple - games have become more and more expensive to make while their price has pretty much stayed the same.

But another major reason for the perceived decay has to do with the rise of social media, where negativity wins.
See people like Threat Interactive that make tons of money by peddling misinformation.
Other creators that equally have no clue join in on the bashing, because it creates clicks.
The actual truth doesn't matter anymore, only the perceived truth, as that means more engagement!
Algorithms love engagement! So if everyone says Unreal is bad, it must be true!
 
I'd agree in principle - but worth noting that 'best practices' are not what they used to be, and awareness of performance even less so.
Nothing exemplifies this better than one of the tweets I've seen a few years back comparing costs of running a 'modern' UI framework (Flutter) on XSX vs. what Scaleform (or other homegrown Flash renderers) used to take on 360 - in CPU costs alone.
We're talking 10x worse performance on modern consoles - for 0 gain - flutter adds nothing that scaleform couldn't do for game UI development - only new (and arguably worse in some respects) tools (and possibly new people who have less familiarity with realtime performance working in games). Of course it has a working license - since Autodesk killed XF.
And this is on top of the fact UI framework performance already sucked in 360 era. Another Flash framework (before XF was really a thing) was literally responsible for an entire series of sequels running at 30fps instead of 60 in the PS2 era - until 2-3 years in when it got 'fixed'.

Also back in late 360 era I was on a project (not Unreal - but all the same principles applied) that swapped entire gameplay layer from script to native C++ 'for performance reasons'. It went about as well as you'd imagine (meaning not at all) - especially when you consider engineers that used to work in script land were now expected to write high performing C++ code.

But I do agree with you there are other - lower hanging performance gotchas you're likely to run into faster - where things get ugly is that these things are usually stacked - so even if you fix one, the other issues likely 'also' exist, it's why many games with poor performance have no simple way to dig themselves out of that hole - other than simply throwing more hardware at the problem that is.
There are definitely parts of the tech stack that aren't ideal and can and should be improved. But I think the current 'programming landscape' in game dev looks quite a bit different from what you are talking about, which indeed doesn't sound great.
"Game Programming" in Unreal is super accessible through Blueprints since most important systems are part of the engine. The core of best practices are honestly pretty simple and manifest in different ways that are not always obvious.
The core I'd say is:
- Don't do work (work = CPU work) you don't have to do more often than you need ( = save on CPU perf)
- Don't store data you can trivially fetch/calculate at a low cost ( = save on memory)
- Subdivide large tasks over time on the Game Thread (time slicing; see the sketch after this list) or
- Offload tasks onto separate threads if you can
- Use streaming systems for large chunks of data where possible
etc.
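A minimal sketch of the time-slicing bullet (all names are hypothetical, not an engine API): keep an index into the pending work and consume a fixed batch per tick instead of processing everything in one frame.

```cpp
// ChunkProcessor.h -- hypothetical time-slicing example: spread a large batch
// of work across frames on the game thread instead of paying for it all at once.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "ChunkProcessor.generated.h"

UCLASS()
class AChunkProcessor : public AActor
{
	GENERATED_BODY()

public:
	AChunkProcessor()
	{
		PrimaryActorTick.bCanEverTick = true;
	}

	virtual void Tick(float DeltaSeconds) override
	{
		Super::Tick(DeltaSeconds);

		// Process at most ItemsPerFrame entries each tick, then pick up where
		// we left off next frame, so no single frame pays for the whole array.
		const int32 End = FMath::Min(NextIndex + ItemsPerFrame, PendingItems.Num());
		for (; NextIndex < End; ++NextIndex)
		{
			ProcessItem(PendingItems[NextIndex]);
		}
	}

private:
	void ProcessItem(const FVector& Item)
	{
		// ...placeholder for whatever per-item work would otherwise run in one go...
	}

	UPROPERTY()
	TArray<FVector> PendingItems;

	int32 NextIndex = 0;
	static constexpr int32 ItemsPerFrame = 64;
};
```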

Then a manifestation of this in Unreal is the nesting of multi-prepass widgets in Slate/UMG/UI.
Representing "don't do work you don't have to do".
There are some UI widgets that need to calculate data from the entire widget hierarchy 2 times; and if you nest them, this grows exponentially.
Here is a neat demonstration of a small UI setup and how it can affect performance.


From 577 microseconds (~0.58 milliseconds, so roughly 3.5% of a game's perf budget when targeting 60 fps (16.6 ms per frame), or ~1.7% when targeting 30 fps (33.3 ms per frame)), down to 25 microseconds (0.025 milliseconds, so ~0.15%, or about 1/667, of a game's perf budget when targeting 60 fps).

It's a minor difference when setting it up, and since it's user-friendly to set up (just click a few buttons), a) you won't immediately notice that performance is dropping, and b) if you see something like Slate Prepass taking 577 microseconds per frame, you won't know whether that's a reasonable amount since you don't have a reference from the get-go.
You just see a regular random engine internal call taking up a certain amount of time, but it is heavily influenced by the content you have set up.
And herein lies a big problem. If you aren't intimately familiar with Slate/UMG, and you are a UI artist with a little bit of programming experience, you might set up some UI but do it in a very specific and wrong way.
There are a million of these pitfalls, and you have to be very dedicated to your job, watching these videos in your free time, to amass a trivia knowledge base about these sorts of pitfalls.
Of course, in an ideal world, the game dev process has content review and seniors or leads know every last one of these pitfalls and take measures to prevent them, but reality is not ideal.
Even competent leads might know a bunch of these pitfalls and address them correctly, but the remaining ones can go by unnoticed and will be hard to track down.
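And to give one concrete, if simplified, example of the "don't do work you don't have to do" point as it shows up in UI (this is a generic event-driven pattern with made-up names, not the prepass case above): push values into widgets when they actually change instead of using property bindings that are re-evaluated every frame.

```cpp
// HealthBarWidget.h -- hypothetical event-driven UI update: call SetHealth()
// when the value changes instead of using a property binding that re-evaluates
// every frame whether anything changed or not.
#pragma once

#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "Components/ProgressBar.h"
#include "HealthBarWidget.generated.h"

UCLASS()
class UHealthBarWidget : public UUserWidget
{
	GENERATED_BODY()

public:
	UFUNCTION(BlueprintCallable, Category = "UI")
	void SetHealth(float Current, float Max)
	{
		if (HealthBar && Max > 0.0f)
		{
			HealthBar->SetPercent(Current / Max);
		}
	}

protected:
	// Bound to a ProgressBar of the same name in the widget Blueprint.
	UPROPERTY(meta = (BindWidget))
	UProgressBar* HealthBar = nullptr;
};
```

The owning actor or player state then calls SetHealth() from its own change event, so the widget does zero work on frames where nothing changed.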
 