Borderlands 4 Launches To Mostly Negative Steam Reviews Over Performance Issues And Crashing

Last night's update improved perf further for me. Easily pushing 90fps now in open areas and combat; I was getting around 70-80 before.
 
The RTX 5090 has a $2,000 MSRP and it's actually available at that price right now. I know that's still expensive, but it's nowhere near the $5,000 you said.


ym18jbW2roZdM6ru.jpg


Moreover, the RTX 5090 certainly isn't struggling to run Borderlands 4 at 1080p/60fps, as my 4080S can already achieve a locked 60fps with Badass (maxed out) settings at 1080p. I'm guessing you watched YouTubers who showed you Borderlands 4 performance while it was still compiling its shaders in the background. When I first started playing the game, the performance was much worse for around 5–10 minutes, but soon I was getting 95 fps instead of 45 fps in exactly the same place. Even if you just change the graphics settings, you need to wait a few minutes before performance improves.


Here's 1080p DLAA (native) with badass (maxed out) settings on my RTX 4080S. Keep in mind the RTX 5090 is twice as fast (and twice as power hungry and three times as expensive :P).


1080p-1.jpg


2.jpg


3.jpg


4.jpg



At 1440p with DLAA (native) and high settings, the game runs below 60 fps on my PC. However, I think the 5090 could handle these settings and maintain a consistent 60 fps, because even my 4080S isn't far from that target.


1440p-DLAA-badass.jpg


1440p-DLAA-badass-2.jpg



With DLSS Quality, Borderlands 4 already runs at well over 60fps at 1440p with badass settings.


1440p-DLSSQ-badass-2.jpg


1440p-DLSSQ-badass.jpg



With FGx2 on top of that, the game feels smooth as butter at 138fps, and I measured latency between 32 and 36 ms. Without FGx2 latency was 28-32ms, so not a big difference, and you get a much smoother and sharper image during motion. It's also easier to aim compared to the base framerate, so IMO DLSS FGx2 is a must in this game.
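For anyone curious why FGx2 adds so little latency here, a quick back-of-the-envelope sketch (Python; this is just my own arithmetic illustrating the numbers above, not output from any measurement tool):

```python
def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds for a given framerate."""
    return 1000.0 / fps

# At 138fps with FGx2, only every other frame is rendered, so the
# base framerate is ~69fps. Input latency tracks the ~14.5ms base
# frame time, not the ~7.2ms displayed frame time, which is why
# measured latency barely changes when frame gen is enabled.
displayed = frame_time_ms(138)      # ~7.2 ms between displayed frames
rendered = frame_time_ms(138 / 2)   # ~14.5 ms between rendered frames
print(f"displayed: {displayed:.1f} ms, rendered: {rendered:.1f} ms")
```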

1440p-DLSSQ-4.jpg


1440p-DLSSQ-5.jpg

Borderlands4-2025-09-14-11-29-28-697.jpg


Here's a comparison between badass vs high vs medium settings at 1440p DLSSQ + FGx2:


On my old 2560x1440 LCD I would probably play with badass settings, but on a 4K OLED monitor I want a much sharper image, so I need to lower some settings.

I need to use medium settings to get a similar framerate at 4K DLSSQ (75-90fps).


With FGx2 I get around 130-150fps


I can also use the high settings preset and achieve a similar framerate by choosing DLSS Performance instead of DLSS Quality.


Here's 4K DLSS Ultra Performance with high settings and FGx2 (170-180fps).


4K Ultra Performance with medium settings and FGx2: 190fps. I wouldn't be surprised if the PS5's image quality looked worse than that. Perhaps a kind PS5 owner will share a screenshot taken in exactly the same spot for a proper comparison. I didn't buy that awesome console, so I can't post my own comparison.


I don't want to defend Borderlands 4, because this game is much more demanding than a typical UE5 game and, in my opinion, the graphics do not justify the requirements. Some of the assets in the game, especially the textures and trees, remind me of original Xbox games, not even X360.


The lighting can look stunning at times, but I think it would be possible to achieve a similar look at a much lower cost. Borderlands 4 is almost as demanding as path-traced games, and that shouldn't be the case, because Lumen is supposed to be faster than standard RT, and that was the whole point of it. However, thanks to the AI, even this demanding game is perfectly playable on PC. I'm totally convinced that I'm playing at 4K with 130–140 fps, and that's what matters to me. Sure, 4K DLAA (native) and a real 240fps would be even better, but it wouldn't change my experience that much. I paid for Tensor Cores and I don't mind using the AI technology that my graphics card provides.

1440p-DLAA-badass-2.jpg


This is just ridiculous... 50 fps at 1440p on a 4080 Super powerhouse card!


Here's a 4060, which is ~3 times slower, running an actually optimized game.

120 fps on a 3 times slower card in a game that looks better:

0yn684himi346.png


 
1440p-DLAA-badass-2.jpg


This is just ridiculous... 50 fps at 1440p on a 4080 Super powerhouse card!


Here's a 4060, which is ~3 times slower, running an actually optimized game.

120 fps on a 3 times slower card in a game that looks better:

0yn684himi346.png



I can understand your perspective because Borderlands 3 runs at 260–280 fps on my PC at native 1440p instead of 50 fps, and the graphics don't look five times worse to me. Dynamic GI isn't going to dramatically improve the graphics of games with a cel-shaded aesthetic, and I think pre-baked lighting wouldn't look significantly worse. Gearbox probably chose dynamic GI just to drastically reduce the size of the game and the development time.

On the other hand, the volumetric lighting in Borderlands 4 looks stunning. I often stop playing the game for a moment just to admire the sun's god rays. There are also far more objects on screen compared to Borderlands 3, particularly grass.
 
Never liked BL artwork; looks like a cheap attempt at cel-shading.

Can't believe how shitty its performance is with this kind of graphics
It actually looks pretty incredible in spots, and base PS5 performance has been perfect. It's just a PC issue right now, but they are already pushing a patch. Definitely not an excuse for it, but it seems they were close to fixing it up for the PC launch and must have run into a roadblock, though not one big enough to warrant delaying it everywhere.
 
I'm way more curious WTF the Switch 2 version is like, considering it targets ~30 fps while monster GPUs are being kneecapped
Here's a side-by-side comparison between XSS and XSX; it's not looking good for NS2, taking into account that the system is weaker than the XSS.

 
5070 - 33fps at only 1440p
17 fps at 4K...

What settings do consoles even use? Need DF comparison ASAP.
From a quick glance, medium-high settings. Series S on the lower end, but still noticeably better than PC Low.

The game is running much better on consoles. Console equivalent PC hardware will be far, far below 60 FPS on these settings.
 
From a quick glance, medium-high settings. Series S on the lower end, but still noticeably better than PC Low.

The game is running much better on consoles. Console equivalent PC hardware will be far, far below 60 FPS on these settings.
you wish

 

Can't you read? That's frame gen on.
The latency will be absolutely horrible. Only turn frame gen on if you have at least 60 FPS. To make matters worse, that is DLSS Quality at 1080p (which is internally 720p; the PS5/XSX will certainly use a higher resolution than that).

And consoles are not using frame gen.

What a stupid post. You just proved my point.
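For reference, DLSS renders internally at a fixed fraction of the output resolution per preset; a quick sketch (Python) using the standard NVIDIA scale factors, which games can override, so treat the numbers as defaults:

```python
# Standard DLSS render-scale factors per quality preset (NVIDIA's
# usual defaults; individual games can override them).
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_res(width: int, height: int, preset: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and DLSS preset."""
    scale = DLSS_SCALES[preset]
    return round(width * scale), round(height * scale)

# 1080p output with DLSS Quality renders internally at 1280x720,
# which is where the "internally 720p" figure comes from.
print(internal_res(1920, 1080, "Quality"))
```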
 
From a quick glance, medium-high settings. Series S on the lower end, but still noticeably better than PC Low.

The game is running much better on consoles. Console equivalent PC hardware will be far, far below 60 FPS on these settings.
Yeah nah, medium, and many settings on low with hardware Lumen disabled, is my guess. Hardware Lumen might be the culprit for the horrendous perf.
 
Can't you read? That's frame gen on.
The latency will be absolutely horrible. Only turn frame gen on if you have at least 60 FPS.

And consoles are not using frame gen.

What a stupid post.
You said "far far below 60fps on these settings" and that's a lie.

But I'll ask you this: don't you have eyes??? He is getting 100fps with frame gen on, so his frames must be around 50 (it'll have low latency) on medium 1080p.

Also, consoles can't stay locked at 60fps or 30fps, and that's worse than having frame gen on at 50fps.

Also, point 2: regarding "Series S on the lower end, but still noticeably better than PC Low", the Series S is running at 630p@30fps; any PC running this at low 1080p@30fps is already running it better than Series S, and I'm the one making stupid posts out here.
 
You said "far far worse" and that's a lie.

But I'll ask you this: don't you have eyes??? He is getting 100fps with frame gen on, so his frames must be around 50 (it'll have low input lag) on medium 1080p.

Also, consoles can't stay locked at 60fps or 30fps, and that's worse than having frame gen on.
No that's not how it works.

Frame gen is NOT free. It's not a 2x on performance. If he has 100 FPS with FG on, he should have around 43 real fps, which is in fact far, far below 60 FPS. And this is not 1080p; it is 720p internally because it is using DLSS Quality. The consoles are likely upscaling something like 820p to 1440p, or in the best case 1080p to 4K.

The PC optimization in this title is dogshit, don't try to defend it.
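The arithmetic the two sides are arguing over can be sketched like this (Python; the 10% overhead fraction is an illustrative assumption on my part, since the real frame-gen cost varies per game and GPU):

```python
def rendered_fps(fg_fps: float) -> float:
    """With FGx2, every second displayed frame is generated,
    so the rendered framerate is exactly half the displayed one."""
    return fg_fps / 2

def estimated_fps_fg_off(fg_fps: float, overhead: float = 0.10) -> float:
    """Turning FG off frees the GPU time FG itself consumes, so the
    FG-off framerate is somewhat higher than the rendered half.
    `overhead` is the assumed fraction of GPU time FG costs."""
    return rendered_fps(fg_fps) / (1 - overhead)

print(rendered_fps(100))          # 50.0 rendered fps behind a 100fps FGx2 reading
print(estimated_fps_fg_off(100))  # ~55.6 fps expected with FG disabled
```

Under this assumption, a 100fps FGx2 reading implies roughly 50fps rendered and around 55fps with FG switched off, consistent with an effective boost of about 80%.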
 
Also 2 "Series S on the lower end, but still noticeably better than PC Low" Series S is running at 630p@30fps, any PC running this at low 1080p@30fps is already running it better than Series S and i m the one making stupid post out here.
Please, I'm not talking about resolution here. Obviously DLSS at 1080p will look sharper than whatever the Series S is doing. That is not the point.

The point is that graphics effects such as grass density or the lighting will be significantly reduced on PC low compared to Series S. So you get worse graphics than on that console.
RsgBvYxjyk5tm0ZV.png


fRvjEgVazWbeSFTE.png


Below is PC low. Above Series S.

Series S has much more grass and much better lighting. PC Low looks like a PS2 game.
 
here is a side by side comparison between XSS vs XSX, its not looking good for NS2 taking into account that the system is weaker than the XSS.



I hope the difference outdoors on Series S is just a completely different time of day compared to dusk on Series X?

It seems they didn't drop Lumen, maybe just a lower quality version of it.

It's not looking good for Switch 2, of course, especially since the game really seems badly optimized.
 
No that's not how it works.

Frame gen is NOT free. It's not a 2x on performance. If he has 100 FPS with FG on, he should have around 43 real fps, which is in fact far, far below 60 FPS. And this is not 1080p; it is 720p internally because it is using DLSS Quality. The consoles are likely upscaling 900p to 1440p, or in the best case 1080p to 4K.

The PC optimization in this title is dogshit, don't try to defend it.
never said that PC optimization is good
 
Please, I'm not talking about resolution here. Obviously DLSS at 1080p will look sharper than whatever the Series S is doing. That is not the point.

The point is that graphics effects such as grass density or the lighting will be significantly reduced on PC low compared to Series S. So you get worse graphics than on that console.
RsgBvYxjyk5tm0ZV.png


fRvjEgVazWbeSFTE.png


Below is PC low. Above Series S.

Series S has much more grass and much better lighting. PC Low looks like a PS2 game.
I'll give you that grass density is way lower on PC low, but c'mon, who the hell will run this game on the lowest settings with a console-equivalent PC???

Also, your PC screenshot looks like it has volumetric fog disabled in the .ini files, and it's not even from the same place. Try to be a little more honest next time.

This is what low really looks like on PC.



 
That's like saying the PS3's Cell processor wasn't a piece of shit hardware, you just needed really good devs to take advantage of it. Dedication, yo.

The best tools are those that make difficult things easy, I don't think this is a revolutionary concept.
The mistake you make is thinking about it that way. With the latest Unreal Engine, porting games is piss easy and dev time is short, making them more cash.

But the downside is quality going downhill.
But eh, fix it later, fuck 'em.
 
You do by defending this trash port.
Not at all; my point was that you said a console-equivalent PC would run the game far, far below 60 fps, and that's not true.

And yes, I would prefer to play with potato graphics at 60fps rather than at sub-30fps with better graphics; ffs, this is a fast FPS game.
 
The RTX 5090 has a $2,000 MSRP and it's actually available at that price right now. I know that's still expensive, but it's nowhere near the $5,000 you said.


ym18jbW2roZdM6ru.jpg


Moreover, the RTX 5090 certainly isn't struggling to run Borderlands 4 at 1080p/60fps, as my 4080S can already achieve a locked 60fps with Badass (maxed out) settings at 1080p. I'm guessing you watched YouTubers who showed you Borderlands 4 performance while the game was still compiling its shaders in the background. When I first started playing the game, the performance was much worse for around 5–10 minutes, but soon I was getting 95 fps instead of 45 fps in exactly the same place. Even if you just change the graphics settings, you need to wait a few minutes before performance improves.


Here's 1080p DLAA (native) with badass (maxed out) settings on my RTX 4080S. Keep in mind the RTX 5090 is twice as fast (and twice as power hungry and three times as expensive :P).


1080p-1.jpg


2.jpg


3.jpg


4.jpg



At 1440p with DLAA (native) and high settings, the game runs below 60 fps on my PC. However, I think the 5090 could handle these settings and maintain a consistent 60 fps, because even my 4080S isn't far from that target.


1440p-DLAA-badass.jpg


1440p-DLAA-badass-2.jpg



With DLSS Quality, Borderlands 4 already runs at well over 60fps at 1440p with badass settings.


1440p-DLSSQ-badass-2.jpg


1440p-DLSSQ-badass.jpg



With FGx2 on top of that, the game runs at a high refresh rate and feels smooth as butter. I measured 32-36 ms with FGx2. Without FGx2 latency was 28-32ms, so not a big difference, and you get a much smoother and sharper image during motion. IMO DLSS FGx2 is a must in this game.

1440p-DLSSQ-4.jpg


1440p-DLSSQ-5.jpg

Borderlands4-2025-09-14-11-29-28-697.jpg


Here's a comparison between badass vs high vs medium settings at 1440p DLSSQ + FGx2:


On my old 2560x1440 LCD I would probably play with badass settings, but on a 4K OLED monitor I want a sharper image, so I need to lower some settings.

I need to use medium settings to get a similar framerate at 4K DLSSQ (75-90fps).


With FGx2 I get around 130-150fps


With the high settings preset and DLSS Performance I get the same framerate as medium settings with DLSSQ.


Here's 4K DLSS Ultra Performance with high settings and FGx2 (170-180fps).


4K Ultra Performance with medium settings and FGx2: 190fps. I wouldn't be surprised if the PS5's image quality looked worse than that. Perhaps a kind PS5 owner will share a screenshot taken in exactly the same spot for a proper comparison. I didn't buy that awesome console, so I can't post my own comparison.


I don't want to defend Borderlands 4, because this game is much more demanding than a typical UE5 game and, in my opinion, the graphics do not justify the requirements. Some of the assets in the game, especially the textures and trees, remind me of original Xbox games, not even X360.


The lighting can look stunning at times, but I think it would be possible to achieve a similar look at a much lower cost. Borderlands 4 is almost as demanding as path-traced games, and that shouldn't be the case, because Lumen is supposed to be faster than standard RT, and that was the whole point of it. However, thanks to the AI, even this demanding game is perfectly playable on PC. I'm totally convinced that I'm playing at 4K with 130–140 fps, and that's what matters to me. Sure, 4K DLAA (native) and a real 240fps would be even better, but it wouldn't change my experience that much. I paid for Tensor Cores and I don't mind using the AI technology that my graphics card provides.
Please don't bother quoting me and writing these long posts, I won't read them anyway
 
No that's not how it works.

Frame gen is NOT free. It's not a 2x on performance. If he has 100 FPS with FG on, he should have around 43 real fps, which is in fact far, far below 60 FPS. And this is not 1080p; it is 720p internally because it is using DLSS Quality. The consoles are likely upscaling something like 820p to 1440p, or in the best case 1080p to 4K.

The PC optimization in this title is dogshit, don't try to defend it.
Yes, FGx2 is not free. In the best-case scenario FGx2 boosts the framerate by 82%, so if Borderlands 4 was running at 100fps on the RTX 4060, the base framerate was probably around 55fps, with latency somewhere around 35-40ms.
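That estimate is just the boost figure inverted; a tiny sketch of the arithmetic (Python; the 0.82 gain is the figure quoted above, not a measured constant):

```python
def base_fps(fg_fps: float, fg_gain: float = 0.82) -> float:
    """Base (FG-off) framerate implied by an FGx2 framerate, assuming
    FGx2 boosts displayed fps by `fg_gain` (82% in the best case)."""
    return fg_fps / (1 + fg_gain)

# A 100fps FGx2 reading implies roughly a 55fps base framerate.
print(round(base_fps(100)))
```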
 
I'm getting 60fps on RTX 5080, DLSS Quality, 4K, FG off

I'll take some screenshots later
 
No that's not how it works.

Frame gen is NOT free. It's not a 2x on performance. If he has 100 FPS with FG on, he should have around 43 real fps, which is in fact far, far below 60 FPS. And this is not 1080p; it is 720p internally because it is using DLSS Quality. The consoles are likely upscaling something like 820p to 1440p, or in the best case 1080p to 4K.

The PC optimization in this title is dogshit, don't try to defend it.
Frame gen is 2x, so if you are getting 100fps with it on, your real frame rate is exactly half, as every second frame is generated; in this case it will be 50fps. But frame gen has a performance penalty from enabling it, so the framerate should be higher with it disabled. And it is:



As for the PS5, it is almost certainly sub-1080p, and it drops to the low-40fps range as well.
 
Please don't bother quoting me and writing these long posts, I won't read them anyway
You don't have to read my posts, but others will see what a liar you are. Nothing you said is true, and I proved it.

Just a while ago, you told lies about MGSD, such as claiming that my 4080S must run MGSD at 720p to achieve 60 fps, or that 4K DLSS is equivalent to playing at 1080p. You're repeating the same false narrative yet again, this time in another thread.
 
Frame gen is 2x, so if you are getting 100fps with it on, your real frame rate is exactly half, as every second frame is generated; in this case it will be 50fps. But frame gen has a performance penalty from enabling it, so the framerate should be higher with it disabled. And it is:



As for the PS5, it is almost certainly sub-1080p, and it drops to the low-40fps range as well.

Some pixel counts point to 850p in Performance mode and 1300p in Quality mode, but I'm waiting for DF confirmation.

Series S is 600-720p.
 
They don't need to waste money and time on optimization when they have DLSS+FG to fake great performance for us.
That is kind of the problem. They are starting to lean on that stuff now. It happened with Monster Hunter Wilds.

I am installing the game on my laptop now. Let's see if it blows it up.
 
People expecting the console versions to be good based on PC performance are deluding themselves; prepare for some low resolutions...



The game is very AMD friendly on PC. Also, to the people saying "You just need to OC that 5080 to match 4090 performance!":

WAaGe1qh3PkQTyuu.jpg


Hahaha...
 
That is kind of the problem. They are starting to lean on that stuff now. It happened with Monster Hunter Wilds.

I am installing the game on my laptop now. Let's see if it will blow it up.
Monster Hunter had 1M+ CCU
Borderlands 4 is at 290K CCU

There's nothing stopping this train as long as we keep buying these games.
 
No that's not how it works.

Frame gen is NOT free. It's not a 2x on performance. If he has 100 FPS with FG on, he should have around 43 real fps, which is in fact far, far below 60 FPS. And this is not 1080p; it is 720p internally because it is using DLSS Quality. The consoles are likely upscaling something like 820p to 1440p, or in the best case 1080p to 4K.

The PC optimization in this title is dogshit, don't try to defend it.
What? No, it's the opposite. Frame gen doesn't result in a multiplier of over two times: if you have 100fps with frame gen on, it typically means you had around 55-65 with it off, since it usually increases fps by about 80%.



Here for instance, he gets 50-60fps with frame gen off, but with frame gen on, he gets 95-105fps.
 