Unreal Engine is slowly causing the decay of the industry. What can be done?

What is funny to me is that Epic has all those engine coders and maintenance staff, yet CDPR is fixing stuff for them lmao. It seems like the engine just got out of its beta phase imho.

It has its issues, but hopefully things will improve down the line, since it's a powerful tool for AA devs. Remember my dudes, art direction is everything. The engine is definitely flexible enough to make distinct-looking games, so it's not like the UE3 phase.

But they should absolutely still optimize it to the point that IQ looks good and stutters are resolved. It will be absolutely funny to me if CDPR manages that with The Witcher 4, all the optimizations they made get merged into the main engine branch, and Epic themselves act like they figured it out lmao. Too bad games on older engine versions will still suffer.

Also I feel like when cancel culture was at its peak, lots of capable devs said "I ain't got time for that" and left the industry. I'm curious to see what OGs with their AA studios will make. Like Raphael Colantonio, or Chris Avellone with those Quantic Dream ex-devs.
 
This generation really brought a bunch of armchair devs out of the woodwork.




The fact that Exp33 exists is enough for me to forgive all the transgressions Epic and UE may have committed.
 
I'm no expert on the topic because I stick to developing 2D games.

But maybe UE5 should start new projects with all the bells and whistles deactivated. Then only devs who know their stuff would gradually add the advanced "bloat" when needed.
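For illustration only, a "lean defaults" project could look something like the sketch below in DefaultEngine.ini. The cvar names and values are taken from recent UE5 builds and should be treated as assumptions to verify against the docs for your engine version, not as a definitive recipe.

```
; DefaultEngine.ini - illustrative sketch, verify cvar names/values for your UE5 version
[/Script/Engine.RendererSettings]
r.DynamicGlobalIlluminationMethod=0   ; 0 = no dynamic GI (1 = Lumen)
r.ReflectionMethod=2                  ; 2 = screen-space reflections instead of Lumen reflections
r.Shadow.Virtual.Enable=0             ; disable virtual shadow maps
r.AntiAliasingMethod=2                ; plain TAA instead of TSR
r.Nanite=0                            ; opt out of Nanite rendering
```

A team that knows what it's doing could then flip these back on one at a time and measure the cost of each.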
 
Something is not adding up though. Why are some games so much worse than others? That sounds like a developer problem.
 
Something is not adding up though. Why are some games so much worse than others? That sounds like a developer problem.

Exactly.
Exp33 came out great with way fewer devs than the bigger studios that put out badly performing games.
 
An unoptimized game is not the fault of the engine; it's the fault of the developer.
Tim Epic says it's the fault of the devs, the devs (of Borderlands 4) say it's the fault of gamers not buying hardware powerful enough to run the games, and the gamers blame the engine, so we circle back endlessly.
 
What is funny to me is that Epic has all those engine coders and maintenance staff, yet CDPR is fixing stuff for them lmao. It seems like the engine just got out of its beta phase imho.

It has its issues, but hopefully things will improve down the line, since it's a powerful tool for AA devs. Remember my dudes, art direction is everything. The engine is definitely flexible enough to make distinct-looking games, so it's not like the UE3 phase.

But they should absolutely still optimize it to the point that IQ looks good and stutters are resolved. It will be absolutely funny to me if CDPR manages that with The Witcher 4, all the optimizations they made get merged into the main engine branch, and Epic themselves act like they figured it out lmao. Too bad games on older engine versions will still suffer.

Also I feel like when cancel culture was at its peak, lots of capable devs said "I ain't got time for that" and left the industry. I'm curious to see what OGs with their AA studios will make. Like Raphael Colantonio, or Chris Avellone with those Quantic Dream ex-devs.

UE5 is an amazing but extremely complex piece of tech. It's free for everyone, and compared to Unity it's very bloated.
But the money is in UEFN. Epic pays millions to streamers making boxfight maps in Fortnite, because that's what the kids want.
 
Feels like a very dramatic statement to me.

Install the game, let the Nvidia app determine settings, play… That has worked fine for me across a few UE5 games, and I'm just on a gaming laptop. Games look great and play great.

Feel like people always try to push max settings when sometimes they are just kind of shit implementations.

That is true to a certain extent, but Unreal Engine 5 (and 4) has a very serious problem with stuttering, which is worse on PC because consoles use fixed hardware and ship the compiled shaders with the game code and updates. Even then, consoles still suffer from traversal stutters - just play Oblivion Remastered on PS5 Pro and tell me that is running smoothly! - and that's without even mentioning sub-optimal performance (e.g. poorly framepaced 30 fps caps and a failure to lock to 60 fps - again, like Oblivion Remastered, Silent Hill 2 Remake or Metal Gear Solid Delta on PS5 Pro).

Shader compilation and traversal stutters annoy me far more about UE4 and UE5 games than sub-optimal performance, because I can at least fix the latter by lowering settings and/or using upscaling, but I cannot fix stuttering. Stuttering will happen regardless of how powerful your PC is - more powerful CPUs make it less noticeable - but it will still be there unless you cap the game at a very low framerate (I had to do this to play the awesome but woefully coded Star Wars Jedi: Survivor on PC, for example).

A trick I use with many UE4 games on PC is to force DX11 mode when they default to DX12. I lose some performance, but because DX11 does not require shaders to be pre-compiled, the games typically run much better without any shader compilation stutter. There can still be traversal stutter, unfortunately, but at least the game is playable for me. Unfortunately, you cannot use this trick with UE5 games because they rely on DX12 for features such as Nanite (virtualized geometry), Lumen (software ray-traced lighting) and VSMs (virtual shadow maps).
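For anyone who wants to try the same trick: UE4 titles generally accept a `-dx11` launch argument (and `-dx12` forces the opposite). A sketch of how that might look; the executable name is hypothetical and varies per game:

```
# Steam: right-click the game -> Properties -> Launch Options, then add:
-dx11

# Or launch the shipping executable directly (name is hypothetical):
MyGame-Win64-Shipping.exe -dx11
```

Some games also expose the same renderer choice in their own launcher or config files, so it's worth checking those first.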
 
I hear ya, but I will say Mafia: The Old Country is UE5 and it runs and looks amazing, so maybe over time devs will get the hang of it as UE5 keeps improving? Anyway, that's my hope.

Altho I'm still shook that UE5 causes stutters in PC games; how have they not fixed that yet? Seems so glaring.
 
All while, for example, Kingdom Come: Deliverance 2, with good fidelity and performance in a large open world, runs almost stutter-free on a modified CryEngine...

Smart devs taking the time to learn an engine and modify it to their needs.

It takes nerds in your studio to make it happen, and that's pretty much against the UE5 ecosystem, where accountants are gleefully firing the senior and engine nerds to save a buck, because who needs those when everyone can jump into UE5?

It's not an engine problem, it's an ecosystem problem.
 
What is funny to me is that Epic has all those engine coders and maintenance staff, yet CDPR is fixing stuff for them lmao. It seems like the engine just got out of its beta phase imho.

It has its issues, but hopefully things will improve down the line, since it's a powerful tool for AA devs. Remember my dudes, art direction is everything. The engine is definitely flexible enough to make distinct-looking games, so it's not like the UE3 phase.

But they should absolutely still optimize it to the point that IQ looks good and stutters are resolved. It will be absolutely funny to me if CDPR manages that with The Witcher 4, all the optimizations they made get merged into the main engine branch, and Epic themselves act like they figured it out lmao. Too bad games on older engine versions will still suffer.

Also I feel like when cancel culture was at its peak, lots of capable devs said "I ain't got time for that" and left the industry. I'm curious to see what OGs with their AA studios will make. Like Raphael Colantonio, or Chris Avellone with those Quantic Dream ex-devs.

UE5 has had a messy release.

The initial 5.0 build had no Nanite support for foliage and trees, and this wasn't added until 5.2 or 5.3. Later builds of the engine also added workarounds for shader compilation and traversal stutter, from 5.4, I believe. The problem is that games take so long to develop that we have been getting these flawed earlier builds of UE5 in many cases, and only now are we seeing newer builds being used. Not every developer will bother to update their engine to the latest version, and far too many developers ignore issues like stuttering altogether, which isn't great.
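To give a concrete flavour of those later-build mitigations: bundled shader pipeline caches and PSO precaching are driven by console variables along these lines. The names are recalled from Epic's release notes, so treat them as assumptions to check against the documentation for your specific engine version:

```
; ConsoleVariables.ini - illustrative only, verify against your UE5 version's docs
r.ShaderPipelineCache.Enabled=1  ; use a bundled PSO cache recorded during test playthroughs
r.PSOPrecaching=1                ; compile likely pipeline states ahead of first use (newer UE5 builds)
```

The catch the post describes still applies: a game shipped on an older engine build never picks these up unless the developer migrates.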

I agree that UE5 often feels like a beta build because of this, especially with the prevalence of stuttering in the engine, something that I feel has gotten worse over the years. At least with UE4 and DX11 we only had to endure the odd traversal stutter... now we have to contend with that plus shader compilation stutter, plus games that are so poorly optimised that you have to use upscaling from 1080p or lower resolutions on 1440p and 4K displays to have a hopefully above-60 fps experience.

I'm amazed that Epic aren't more proactive about fixing issues that have been around for 5+ years now. If I were them, I'd be extremely embarrassed by the reputation the engine has gained. We should be celebrating how fantastic the games look, not lamenting how poorly they run...
 
The problem is the engine itself. We can't fix it, only Epic can.

The problem is that the engine can be used to create work that cannot then be corrected afterwards. If a flaw was created during the initial thought process for the game, it cannot be corrected without redoing all the work.

Imagine if a consumer-level product was released in this state. It would be considered broken. Imagine if you had to manually rewrite a paper to add page numbers to it.

The proper fix is to allow the code entered into the application to be app-settings agnostic. Then allow the settings to dynamically apply to the work product. This would allow the work product itself to be portable and not tied to errors made before the project even began. This will also prevent a finished product from being able to be fixed. I know nothing about this technically and this is just a guess, but the next iteration of the engine will likely do just that. These layers need to be automated and independent. With AI this should now be possible.

Epic caused this, Epic has to fix it. When they tell us the issue is inexperienced devs, the software needs to be designed around the least experienced dev. It should not, and indeed does not, need to require high-level expertise to implement properly, and it does not need to cause technical issues in finished software for lack of proper tutorials, warnings, and safeguards. This is a problem that must be, and will be, fixed in the software itself. We can't do anything about it since the default engine has already been chosen; we just have to wait for updates and dev cycles to get out of this.
With all due respect, everything bolded makes zero sense at all, and the yellow part is obvious; so why speak like you have any idea whatsoever?

You brought up one interesting point however:
Unreal Engine is professional-grade software. It's easy to mistake it for consumer-level software, as UE has become quite ubiquitous, on a surface level seems accessible enough for anyone to get started, and the occasional small indie game made by inexperienced people reinforces that impression.
But it remains professional-grade software regardless, and that means you need experienced experts to really get the most out of it.
 
Highly CPU limited, bad RT implementation, has quite a last gen look overall with low geo complexity and apparent LODs.
I do wonder sometimes if the people hating on UE5 have actually played games made on other engines as much as they suggest.
The problem with UE5 isn't the engine, it's the developers' approach to the engine: studios simply don't invest enough in their own programming, thinking that the engine will "automagically" handle everything they throw at it.
I don't know why you're taking my comment about wanting to see a specific engine used more and painting it across the entire industry, as if I want it to become the standard.

Your whole post can be summed up as: different engines are better at different tasks than others. Given the non-open-world, detailed NPC interactions we see with Hitman and soon 007, the engine would fit a lot of styles of games.
 
The engine is definitely flexible enough to make distinct-looking games, so it's not like the UE3 phase.
It was never an engine problem if games looked samey.
Gears of War != Mirrors Edge != Waves (Indie Geometry Wars) != Deadlight != Borderlands 1+2 != Fat Princess != Outlast != Rocket League

Bloom-shader brown-and-grey action stuff (Gears, military shooters) was successful in its time, so a lot of games went that way, but UE3 could do anything the devs were capable of envisioning and creating.

No one forced devs to migrate to UE5 if UE4 has fewer problems and/or is more suited out of the box for their project, but for some reason it was almost a race for some. I wonder why that was/is, since I think both are under the same license agreement and cost the same, with UE4 naturally having more devs already familiar with it. Unreal being basically free until you make profits started with UE4, right?
 
UE5's cutting-edge features are incredibly expensive on runtime compute resources, but the differences they deliver don't often feel revelatory, because developers tend to run all of them at the bare minimum rather than turning a few settings all the way up and letting that define the game's look.

Limitation has historically driven innovation, but that doesn't seem to be happening this generation. Instead of trying to do everything at once, developers should be looking to go all in on one or two key areas: lighting complexity, animations, AI, procedural systems, world complexity. Take those one or two areas and make them part of your game's visual signature. It'll make the game look far more impressive and far more distinctive to take two key features and turn them up to six or seven than to run five or six at about one each.

And this is why games are looking same-y in some cases: all targeting the same hardware, using the same engine and running all the same features at roughly the same fidelity. Consider how cool and distinctive Quake II RTX looks and how much better it could be if the game had been designed with that kind of lighting fidelity in mind - how much it might have informed level, encounter, enemy and puzzle design throughout the whole game if you'd known in advance how much control you'd have over lighting, shadows and reflections. Quake II RTX runs solidly on a 3070 and is even playable on a 2060, and it has a genuine next-gen feel even though the underlying game is nearly 30 years old.
 
That is true to a certain extent, but Unreal Engine 5 (and 4) has a very serious problem with stuttering, which is worse on PC because consoles use fixed hardware and ship the compiled shaders with the game code and updates. Even then, consoles still suffer from traversal stutters - just play Oblivion Remastered on PS5 Pro and tell me that is running smoothly! - and that's without even mentioning sub-optimal performance (e.g. poorly framepaced 30 fps caps and a failure to lock to 60 fps - again, like Oblivion Remastered, Silent Hill 2 Remake or Metal Gear Solid Delta on PS5 Pro).

Shader compilation and traversal stutters annoy me far more about UE4 and UE5 games than sub-optimal performance, because I can at least fix the latter by lowering settings and/or using upscaling, but I cannot fix stuttering. Stuttering will happen regardless of how powerful your PC is - more powerful CPUs make it less noticeable - but it will still be there unless you cap the game at a very low framerate (I had to do this to play the awesome but woefully coded Star Wars Jedi: Survivor on PC, for example).

A trick I use with many UE4 games on PC is to force DX11 mode when they default to DX12. I lose some performance, but because DX11 does not require shaders to be pre-compiled, the games typically run much better without any shader compilation stutter. There can still be traversal stutter, unfortunately, but at least the game is playable for me. Unfortunately, you cannot use this trick with UE5 games because they rely on DX12 for features such as Nanite (virtualized geometry), Lumen (software ray-traced lighting) and VSMs (virtual shadow maps).
No one other than Tim Sweeney is denying that UE5 has issues. Epic has consistently improved the performance of Lumen from UE5.1 to 5.2, then 5.4 and now 5.6, literally doubling the performance of hardware Lumen. They have also improved the CPU bottlenecks by 50%. So clearly the engine had issues.

But those issues were addressed by March 2024 when UE5.4 was released, and the Oblivion developers chose not to use that version despite having a full year of dev time still left. Expedition 33 came out literally the same day, and while they only had 4 programmers, they were able to upgrade to UE5.4 and had none of the CPU bottlenecks we saw in Oblivion Remastered.

Borderlands 4 is built on UE5.4 and does not have the same CPU issues; it's just a heavy game on the GPU. It's also not using hardware Lumen, so all of the software Lumen performance improvements made from UE5.1 to UE5.4 are in there. The game is just expensive on the GPU, which has nothing to do with UE5 and everything to do with the developer being a bit too ambitious.

There are no traversal stutters either because, once again, they were sorted out in UE5.4. There are some shader compilation stutters, but they're very minor and are more on the dev, who clearly missed some of the shaders. Not an issue on consoles.
 
UE5's cutting-edge features are incredibly expensive on runtime compute resources, but the differences they deliver don't often feel revelatory, because developers tend to run all of them at the bare minimum rather than turning a few settings all the way up and letting that define the game's look.
The UE5 demo iirc was on a PS5 at 1440p and 30fps. But devs decided to take all those features and then run their games at 60fps, which means a very low resolution, bad IQ, and a really shitty upscaler like FSR1, so the entire game looks like garbage despite using all the new features.

I've been impressed with games like Indiana Jones, Forza Horizon 5, Battlefield 6, etc., all games that are not UE5 and also do not use all these new features. I wonder why it is so important that we must have Lumen or Nanite or RT or whatever, especially if it means the game looks and runs worse than it could with smart art design and old-but-efficient techniques that make the game run better.
 
All Ubisoft engines run great imo. Snowdrop, Anvil, etc.
They both literally have the same profile as UE5 games with software Lumen.

Avatar drops to 720p and 51 fps.
Outlaws has a similar performance profile.

The only difference is that they are using hardware RT while most UE5 games have used software Lumen on consoles. On PC, in games where I've been able to toggle hardware Lumen, they perform virtually identically to hardware RT games like Avatar, Outlaws and AC Shadows. id Tech games like Indiana Jones and Doom have a higher performance profile, but only because they look last-gen despite using RTGI.
 
No one other than Tim Sweeney is denying that UE5 has issues. Epic has consistently improved the performance of Lumen from UE5.1 to 5.2, then 5.4 and now 5.6, literally doubling the performance of hardware Lumen. They have also improved the CPU bottlenecks by 50%. So clearly the engine had issues.
Every game ever has room for improvement; that doesn't mean it has "issues". It's the job of each dev team to bring their game to a level that doesn't feel like an open beta test or remind us of what was normal/acceptable in the Star Fox and GoldenEye days. Engines, like games, are forever changing and improving with time and are never truly finished/perfect. Both just need time to cook until they're feature-complete and running well enough, and that cannot be rushed. With increasing complexity, a good dev needs to recognise in time when the current state of things is not close to good enough - when in, say, the remaining 6 months till release it won't just magically start running buttery smooth - and downgrade it from the bullshots they presented earlier. Which of course is another path of problems and backlash.
 
Gaming visuals are now very technically complex; you can't expect devs to make their own 3D engines from scratch like they did in the past.

Anyone remember all the ugly Renderware PS2 games?
Wasn't GTA using that? I know Burnout 3 was, as well as the Persona games. Don't see what's wrong with it. Most devs modified it in house.

REDengine powered Cyberpunk and The Witcher and it rocked for visuals. Same with the RE Engine by Capcom: Resident Evil, Dragon's Dogma 2, Monster Hunter, etc.

The problem with Unreal is that everything from that gen of the engine always seems to have a similar look. Don't know why. I remember Unreal Engine 3 and all the game characters looking like Marcus Fenix. In Unreal Tournament 3 all the characters were big and bulky, or had a wet plastic look.

Having different engines means different techniques and styles, whereas with Unreal everything will be similar unless they customize the engine.
 
You've got AI rearing its ugly head. Oversized budgets wasted on experiences with little entertainment value or appeal. Sequelitis with dumbed-down and bloated content. Game content being gated and cut into pieces to promote recurring purchases. A bunch of clueless CEOs attempting to catch lightning in a bottle with absurd budget allocations. And you have the audacity to blame the industry's troubles on an engine update that has brought production quality up, brought production time down, and provided entertainment to the masses?
 
People are approaching this the wrong way, with sentences like

"Yes but imagine the time and investment if developers had to do their own engines"

The problem is that most studios did, at some point, have their own engines or alternatives, but the industry has been slowly ditching them in favour of the over-bloated UE. Of course now they would have to make one from scratch, but that wouldn't be the case if they had continued developing them. Just look at Ubisoft: as much hate as they get, they have been supporting and updating theirs, and they have feature parity (and better performance) with UE. The ones to blame are the ones that ditched their own custom engines in favour of this one. It is their fault.
 
So... it doesn't have CPU bottlenecks, but performance is bad anyway? Or is one of you wrong? Who should I believe?
It may have no CPU bottlenecks, but the game drops down to like 800p on PS5 Performance Mode while not looking particularly great and featuring mostly constrained environments with very little happening. The IQ is also blurry and fuzzy.

It's a valiant effort for a small studio, but I wouldn't use it as an example of a well-optimized and performant UE5 game.
 
It may have no CPU bottlenecks, but the game drops down to like 800p on PS5 Performance Mode while not looking particularly great and featuring mostly constrained environments with very little happening. The IQ is also blurry and fuzzy.

It's a valiant effort for a small studio, but I wouldn't use it as an example of a well-optimized and performing UE5 game.



You are both correct, then.

They even seem to have VSMs on. I'm honestly puzzled that a game with so many people in a town and so many particle effects during combat has no CPU bottleneck, but MGS 4 and Mafia do with a single explosion. I believe those are on 5.4 as well?

That has to be bad optimization, right?
 
Highly CPU limited, bad RT implementation, has quite a last gen look overall with low geo complexity and apparent LODs.
I gave a like to this post, but only for the comments quoted below. Couldn't disagree more with the comments above.

I don't think we can say the engine is highly CPU limited just because it's pushing physics we have simply not seen in other games, including UE5 games. DF saw it at Gamescom running on a PC and it was running at 60 fps. Maybe if Massive, Anvil, Northlight, and UE5 were pushing physics like the ones we saw, then sure, we could blame the engine, but we haven't seen that.

I don't know how you can say it has a bad RT implementation or that it has low geometry. LODs, fine. Maybe they don't have Nanite, but you really only see it when driving at fast speeds, and it's on par with what I've seen in Mafia while driving and Avatar while flying. It could just be that the PS5 footage we saw has a shorter draw distance. On PC I'm sure we will see less pop-in; that's true for Borderlands on PC as well, despite it having Nanite.

I guess graphics are subjective, because I think they are pushing an incredible amount of detail and NPCs with a stunning lighting implementation. It is one of the most impressive things I've seen.


I do wonder sometimes if the people hating on UE5 have actually played games made on other engines as much as they suggest.
The problem with UE5 isn't the engine, it's the developers' approach to the engine: studios simply don't invest enough in their own programming, thinking that the engine will "automagically" handle everything they throw at it.
 
You are both correct, then.

They even seem to have VSMs on. I'm honestly puzzled that a game with so many people in a town and so many particle effects during combat has no CPU bottleneck, but MGS 4 and Mafia do with a single explosion. I believe those are on 5.4 as well?

That has to be bad optimization, right?
Yeah, I tried to explain in the MGS thread that no UE5 game drops to 30 fps just because of an explosion, and that it was just a developer issue, likely stemming from the base PS2 game engine running underneath. I also didn't get those drops on PC when I exploded barrels. It's just a bug.

Mafia is on 5.4 and actually performs very well on consoles. It is GPU limited in that the game is just too expensive when lots is going on. You can watch the DF review; they say it's 60 fps most of the time unless you get into the big car chases, where it starts to drop frames. That happens on PC too. It is what it is. As games push more visual features, framerates fail to stay stable.

Expedition 33 is performing just like other UE5 games in terms of GPU load. Most UE5 games drop to 864p, sometimes 720p, just like every other modern game. I compiled a list of games just a few weeks ago that showed Avatar, AC Shadows, Star Wars Outlaws, Dragon Age: The Veilguard, Star Wars Jedi: Survivor, FF16, Alan Wake 2 and virtually every single current-gen game (not cross-gen or Sony current-gen) dropping to 720-864p on consoles. It is literally the norm across all games.

If you want 1440p 60 fps, you can play PlayStation games like DS2, Demon's Souls, Astro Bot and, I'm guessing, the upcoming Ghosts. Spider-Man 2 and Ratchet both drop to 1080p in their 60 fps modes, and they are only pushing one RT mode, with no Nanite/RTGI/Lumen/mesh shader support. I can promise you that the moment Sony studios start pushing RTGI and add mesh shader support, they will also drop to 864p with further drops to 720p.

Last gen, we had AC Unity running at 900p 30 fps on both consoles, and that was 1 year after launch. 5 years after launch, RDR2 ran at 900p on X1. Now, 5 years after launch, we are getting next-gen graphics at 60 fps, albeit at similar resolutions, and we are upset. You can't have it both ways.
 
Invest in better hardware for one. I recommend at least an RTX 5070 16GB or possibly even a 24GB VRAM card for next-gen. Secondly, don't expect 60fps for every game. Some games just aren't built around that target.

There are some issues that need to be sorted on Epic and Microsoft's end (shaders!) but overall the engine's ease of use has allowed for some incredible games to exist that couldn't have otherwise.
DX12 really dropped the ball; that's why I'm better off on Linux. FF16 is the only exception of a game running better on Windows as of now.
 
No one forces developers to use UE5. We can agree or disagree about its value, but blame the engine? They just have to use another one. The thing is, there isn't another engine with the same level of graphical fidelity that is also as easy to push, so UE5 is the only real choice. But developers can freely opt for other alternatives.
 
Exactly. The consumer is always right, and as a consumer, if I see most UE5 games performing like shit then I'll just stop buying games made with it.
This is false. The consumer (same as the customer) is not always right.

We are in 2025 and people still believe this early-1900s marketing slogan is true, when it's been proven false in business over and over again.
 
I'm currently playing MGS Delta and it has the worst HDR implementation I've seen in a long time. Generally most UE5 games have poor HDR; I wonder if it's a dev skill issue, as the majority of UE5 game devs are mostly clueless, or if it's indeed a UE5 problem.
 
Exactly. The consumer is always right, and as a consumer, if I see most UE5 games performing like shit then I'll just stop buying games made with it.
I don't agree with that; in fact, there are times a game gets ruined because they listen to "consumers" too much.
 
Unreal Engine has issues, but imagine the dev times if devs made their own engines instead. We're already at the point where 5 years is considered normal.
Rebellion makes their own engine and releases a game every 2 years. Got any other legit reasons? Skill issue is my guess. Remedy has a decent release cadence and they're also in-house. Really, it's the UE5 studios that are having the problems.
 
E33 has shitty IQ and bad performance, so not sure what you mean.

It's a brilliant game that would likely have been impossible for the devs to put together if they also had to make a game engine.

DLSS-swap to 310.2 preset K and it performs and looks amazing. (I'm not patching the game for Medalum)



Even my shitty 12400 easily locks 60.
 
What is it about UE5 that stops devs from making true S-tier classics? Maybe I'm overlooking one, but I don't think any other engine has as many releases without a 10/10. Why is that?
 
I wonder why it is so important we must have lumen or nanite or RT or whatever especially if it means the game looks and runs worse than it could with smart art design and use of old-but-efficient techniques to make the game run better.
You wonder because you don't understand how games are made.
Old engines used baked lighting, which meant that whenever you moved any light source, the lighting in the whole level had to be rebuilt - every single time!
This was a huge waste of time; the move to RT was a gigantic efficiency gain for art and design.
You simply can't develop modern AAA games, with their huge amount of content, the old way - there is no going back!
 
No one forces developers to use UE5. They just have to use another one. But developers can freely opt for other alternatives.
Good idea, please name those alternatives!
Which other engines besides Unreal, Unity and CryEngine are actually licensable by third-party devs?
 
Good idea, please name those alternatives!
Which other engines besides Unreal, Unity and CryEngine are actually licensable by third-party devs?
Well, you named 2 already, but they can also create their own engine. Anyway, I already said UE5 is the more viable solution; you should care to read everything I said, not just what makes you jump out of your chair.
 