If Senua's Saga: Hellblade II is actually what we can expect from next-gen consoles, does that mean the RTX 2080 will be outdated by 2020?

Can anyone show me a game on PC that is today's version of Crysis?
Not the ray tracing shit in Control or BF5, but something with the OMFG effect Crysis had back then.
Anyone?

The only times I felt like I was in the future were when I saw Crysis, and when I first saw Tomb Raider on a 3dfx Voodoo after coming from the PSOne.

You can have your 2080s and 3090s and wipe your ass with them if the only benefit you get is the same console game at 4K and 100-whatever fps.
The good old days when we looked at PC gaming as god are over. Duke 3D, Quake, Unreal, TIE Fighter... man, the list is long... those experiences are gone.

I already game preferably on my X, and I have a 1080 Ti and a 4K PC monitor. The real-world difference just isn't that major, and you don't have to deal with any hackers, so it's an automatic win. Games like Forza optimised for the X etc. are near identical unless you're stopping gameplay and comparing foliage.
 
I already game preferably on my X, and I have a 1080 Ti and a 4K PC monitor. The real-world difference just isn't that major, and you don't have to deal with any hackers, so it's an automatic win. Games like Forza optimised for the X etc. are near identical unless you're stopping gameplay and comparing foliage.
Exactly.

No one in their right mind would argue against PC always being worlds ahead in power, but that extra power doesn't translate into anything in real life.
 
Exactly.

No one in their right mind would argue against PC always being worlds ahead in power, but that extra power doesn't translate into anything in real life.

I get that. I mean, I played Gears 5 on my X1X, and I fail to see how much better it could look on the PC.
 
PC vs Console is the dumbest argument. Everyone is arguing from different points: mods, cost, "it just works", etc.


To be honest, I want to see this game running at that fidelity at a good framerate in actual gameplay at release before I get too "console vs PC" about it.
 
I get that. I mean, I played Gears 5 on my X1X, and I fail to see how much better it could look on the PC.
21:9 + G-Sync @ 120 fps.
Imo 21:9 is way more immersive, especially when it's right in front of your face.

Then again, Dolby Atmos + HDR @ 4K on a huge screen is also great.

I dunno, I like both tbh.
 
Yes they will, and since consoles are a closed platform they'll have around twice the performance of an equivalent PC part, as per Carmack.

Edit:

[screenshot of the Carmack quote]


Edit: @nkarafo, why the laugh?

That's a quote from 2011.
 
Phil flat out said that the trailer was in-engine, not a CGI movie.

"In engine" is a nebulous term. The thing could have been running at 1 frame a minute generating that video and it still would be "in engine".

Whats featured in this video will not represent the final product.
 
"In engine" is a nebulous term. The thing could have been running at 1 frame a minute generating that video and it still would be "in engine".

Whats featured in this video will not represent the final product.

Not so sure about that, especially after booting up the first Hellblade on my Xbox One X… it looks phenomenal.
 
Phil flat out said that the trailer was in-engine, not a CGI movie.
Did everyone forget the infamous "in engine" Uncharted 4 trailer from 2014? Remember how Sony and Naughty Dog claimed that this is what the final game would look like, and that it would run at 60 FPS? Needless to say, Uncharted 4 ended up looking nothing like it, and it didn't run at 60 FPS, either. "In engine" doesn't mean shit. If this was running in real time on the actual console they would have straight up said it.
 
Did everyone forget the infamous "in engine" Uncharted 4 trailer from 2014? Remember how Sony and Naughty Dog claimed that this is what the final game would look like, and that it would run at 60 FPS? Needless to say, Uncharted 4 ended up looking nothing like it, and it didn't run at 60 FPS, either. "In engine" doesn't mean shit. If this was running in real time on the actual console they would have straight up said it.
Uncharted 4 multiplayer is 60fps, and I'm sorry to disappoint you, but Uncharted 4 did look like the trailer. You can literally Google the chase scene, for example; it's more phenomenal than the trailer!
 
Uncharted 4 multiplayer is 60fps, and I'm sorry to disappoint you, but Uncharted 4 did look like the trailer. You can literally Google the chase scene, for example; it's more phenomenal than the trailer!
Uncharted 4 multiplayer also looks significantly worse than the single player, for obvious reasons.

Look, I've played Uncharted 4 and it's a pretty game, but at no point during actual gameplay do the player models look as good as Drake's does there, and at no point does the game run at 60 FPS like Sony and Naughty Dog claimed when they released that trailer.
 
Uncharted 4 multiplayer also looks significantly worse than the single player, for obvious reasons.

Look, I've played Uncharted 4 and it's a pretty game, but at no point during actual gameplay do the player models look as good as Drake's does there, and at no point does the game run at 60 FPS like Sony and Naughty Dog claimed when they released that trailer.
I have no idea what Uncharted 4 you played, maybe you're just hating and being thick. Uncharted 4 looks like the trailer in so many scenes, in fact it's one of the best looking games this gen and the tech used there is better than most PC games. It runs at 30fps with motion blur, so it does look like 60.
 
I have no idea what Uncharted 4 you played, maybe you're just hating and being thick. Uncharted 4 looks like the trailer in so many scenes, in fact it's one of the best looking games this gen and the tech used there is better than most PC games. It runs at 30fps with motion blur, so it does look like 60.
If you honestly think motion blur can make 30 FPS look like 60, then you need to get your eyes examined. Digital Foundry did a video a few years ago where they recorded some Uncharted 4 footage at half speed and then sped it up to get true 60 FPS gameplay, and the difference was night and day.

 
If you honestly think motion blur can make 30 FPS look like 60, then you need to get your eyes examined. Digital Foundry did a video a few years ago where they recorded some Uncharted 4 footage at half speed and then sped it up to get true 60 FPS gameplay, and the difference was night and day.


I didn't say 30 can be 60, I said the motion blur made it look like 60; it's smoother than other 30fps games!
 
Phil flat out said that the trailer was in-engine, not a CGI movie.
In engine doesn't mean REAL TIME gameplay. The engine being capable of displaying that is one thing, and how the game will actually run on console is another thing.
Uncharted 4 multiplayer is 60fps, and I'm sorry to disappoint you, but Uncharted 4 did look like the trailer. You can literally Google the chase scene, for example; it's more phenomenal than the trailer!


sure does.
I have no idea what Uncharted 4 you played, maybe you're just hating and being thick. Uncharted 4 looks like the trailer in so many scenes, in fact it's one of the best looking games this gen and the tech used there is better than most PC games. It runs at 30fps with motion blur, so it does look like 60.
Maybe there is a Chinese version of Uncharted that everyone played except you, and you got the original copy.
Motion blur ........
I didn't say 30 can be 60, I said the motion blur made it look like 60; it's smoother than other 30fps games!
Stop it!
 
In engine doesn't mean REAL TIME gameplay. The engine being capable of displaying that is one thing, and how the game will actually run on console is another thing.



sure does.

Maybe there is a Chinese version of Uncharted that everyone played except you, and you got the original copy.
Motion blur ........

Stop it!

Mate, you're clearly being a thick hater! You can hate it or love it, drink poison if you have to, but Uncharted 4 does look like the trailer, so have at it!
 
Mate, you're clearly being a thick hater! You can hate it or love it, drink poison if you have to, but Uncharted 4 does look like the trailer, so have at it!
If you think the trailer I posted looks like the final game, then you need a prescription, mate.
Pointing out downgrades doesn't make me a hater.
Mate.
But you know what, motion blur made it look like 60, right?
it runs at 30fps with motion blur, so it does look like 60
Right!
 
Well, since the 3080 Ti will be coming out in 2020, then yes, the 2080 will be outdated in 2020. The consoles will have no bearing on that though.

What an odd question.

Edit: the 3080 Ti will blow the consoles out of the water, so all of you console-only folks should really keep it in your pants.
That says a lot about gaming in general. Consoles are obsolete on day one, sure, but PCs aren't any better because cards will be outdated every 2 years.

Just saying.
 
Developers using footage and claiming it will be in the final product when it's not is very disturbing, yes. And to the OP's question, I don't think today's PCs will be set back by Hellblade 2.
 
That says a lot about gaming in general. Consoles are obsolete on day one, sure, but PCs aren't any better because cards will be outdated every 2 years.

Just saying.

Up-to-date, absolutely amazing graphics with great performance don't suddenly become obsolete just because a graphics card that costs five times as much as the whole machine, and that is going to be purchased by 1% of the gaming audience, was just officially released. This mentality is pathetic tbh.

Just imagine an Xbox One X running Hellblade 2 like that in a store, with people amazed by it, all the while there's a gaffer nerd there saying loud and clear it's obsolete because a 30XX Ti graphics card that no normal person has ever heard of was released. The people around will think he's a retard, which is fair.
 
Up-to-date, absolutely amazing graphics with great performance don't suddenly become obsolete just because a graphics card that costs five times as much as the whole machine, and that is going to be purchased by 1% of the gaming audience, was just officially released. This mentality is pathetic tbh.

Just imagine an Xbox One X running Hellblade 2 like that in a store, with people amazed by it, all the while there's a gaffer nerd there saying loud and clear it's obsolete because a 30XX Ti graphics card that no normal person has ever heard of was released. The people around will think he's a retard, which is fair.
I'm just saying that it's not as fair as you make out to criticize consoles for being obsolete at launch while bragging about PC; in some ways it's total hypocrisy.

It's not retarded, either. But either way, I still PC game, I just don't find it necessary to cry about consoles.
 
Well, since the 3080 Ti will be coming out in 2020, then yes, the 2080 will be outdated in 2020. The consoles will have no bearing on that though.

What an odd question.

Edit: the 3080 Ti will blow the consoles out of the water, so all of you console-only folks should really keep it in your pants.

Since the PS4 gen, consoles have launched with already outdated power. And the gap will only increase, especially when devs start using ray tracing more.

What a new console gen actually does is raise the graphics baseline, so graphics like Control's become more common.
 
I have no idea what Uncharted 4 you played, maybe you're just hating and being thick. Uncharted 4 looks like the trailer in so many scenes, in fact it's one of the best looking games this gen and the tech used there is better than most PC games. It runs at 30fps with motion blur, so it does look like 60.

False.

UC4 looks like that early cinematic compared to the cinematics of the released game. Absolutely agree 100%. But the gameplay doesn't even come close. It won't run like that on a PS4 while playing the game.

And the tech used in UC4 is better than most PC game tech? What do you mean? It uses the same pre-baked pipeline that most other games use.
 
If anything, the PS5 and XSX (is that the right acronym?) will be further behind at launch than the PS4 and XBO were. Jaguar and GCN didn't come out until 2013, the year both consoles were released. The consoles releasing in 2020 will be using Zen 2 CPUs and RDNA GPUs, both of which released in 2019. By the time they're out, both Ryzen 4000 and Nvidia's new 3000 series cards will have been out for months.

No one was happy about the Jaguar back then when the specs leaked, and the opposite is the case for Zen 2. The Jaguar is a tablet CPU; it doesn't matter that it was released on the same day as the XB1. The XSX is ahead of where the XB1 was at launch, even at 10 TF.
 
Yes, you can't match all settings perfectly on PC with the Xbox One X version, but the most expensive settings are already on low or medium, and therefore you can't expect a drastic performance difference from there.

The RX 580 on PC runs RDR2 with similar settings in 4K at around 20fps, and you need over 30fps (around a 35fps average) in order to play at a locked 30fps. It's possible to gain an additional 3-5 fps with further tweaks, but then the game looks worse than the Xbox One X version.

RDR2 is a PC port, but games on the PS4 Pro also show the better per-TFLOP efficiency of the Xbox One X architecture. With only 40% more TFLOPS, the Xbox One X can render 2x as many pixels as the PS4 Pro GPU. It's proof MS has customized their GPU with amazing results, and at this point I don't even like comparing the Xbox One X GPU to a standard RX 580.

I expect MS will also prepare some specific GPU optimizations for the Xbox Series X, so their 12 TF console will be even faster than the numbers suggest.

If the most expensive settings could run at higher levels on the Xbox One X and the Xbox One X had no problem hitting 30 fps with them, then why even bother putting them at lower-than-low settings to start with?
Because it can't run them.


Also, that 580 is running it at ~28 fps, not 20 fps, at 4K with higher settings than the Xbox One X. And it's considered a mess of a launch on PC. So your numbers are wrong, and that's also why your conclusion is wrong.

The fact that we know PC RDR2 has higher settings out of the gate than the Xbox One X, and that we can verify that relatively easily ourselves, makes that game an utter joke for comparing what hardware performs like in comparable situations. Maybe they even nuked more stuff in the game that we don't know about and can't see, like lower tessellation inside buildings we never enter, aka the Crysis idea. We could also nuke that down in RDR2 to get a leap in performance. We could even make specific areas less complex until the houses are boxes.

Pretty sure the PS4 Pro uses a 470 and the Xbox One X uses a 580, which is a more optimized architecture than the 400 series; that explains your other point, since it's a different generation of card with an improved architecture.

You are making another weak analogy; 3GB cards were able to manage for a few years because consoles didn't have good enough bandwidth to push memory anyway. If you are happy with the 2080 Ti getting 3fps more than the PS5 at 4K, good for you.

Yeah, bandwidth is where consoles fell flat on the PS4, not the dog shit CPU and terrible GPU solution. Oh, by the way, didn't you hear it's now on a single SoC, which will make it a super PC architecture and let it compete with high-end hardware on that front, with more VRAM than PCs have and 8 cores, more than PCs have, and all that for only 400 bucks. While a guy with a 3-year-old i7 could slam a new GPU into his box for 300 bucks and already have a PS4 Pro, but at 60 fps, years before it even releases? It's laughable at best.

Sorry mate, I have been around for a while, and the PR shit those console companies are pushing is beyond laughable. Massive respect to Phil Spencer for not walking into this hype bullshit again (remember the power of the cloud) and just straight up giving us the data we want, saying what the performance is like, and being realistic.

Hell, even with Red Dead Redemption 2 people were convinced that the Xbox One X was running stuff at ultra settings before the settings came out on PC, so they could start chanting their victories. Man, I remember when the Xbox One X got announced and everybody and their mom was proclaiming that box to be as fast as a 1080, a card that's what, 20% slower than a 5700 XT, which Phil says is 100% faster than the Xbox One X. Yeah, there you go. Stop believing the hype and start believing actual numbers, because that's what it pushes at the end of the day.

The 2080 Ti sits ~35% above the 5700 XT performance-wise, and 35% is a generational leap. So with your idea that the 2080 Ti only pushes 3 fps more at 4K, that would mean the PS5 is only running the game at around 9 fps while the PC sits at around 12 fps, i.e. both are unplayable.

And that 35-40% matters when you want to hit 60 fps at 4K: it's roughly the difference between the low 40s and 60 fps, and any extra percentage matters at 4K, let alone 35-40%. If you don't think this is a big upgrade, you've probably never played on or owned higher-end systems, or followed that market. People spend bucketloads of money for a 30% increase in performance any day.
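To put rough numbers on that, here is a quick sketch of the scaling (my own illustrative back-of-the-envelope figures under an assumed ~35% gap, not benchmarks):

```python
# Back-of-the-envelope only: how a fixed ~35% gap between two GPUs plays out
# at different baseline framerates.
def faster_fps(base_fps: float, relative_gain: float) -> float:
    """Framerate of a part that is `relative_gain` (0.35 = 35%) faster."""
    return base_fps * (1.0 + relative_gain)

ps5_fps = 9.0                        # hypothetical baseline from the argument above
pc_fps = faster_fps(ps5_fps, 0.35)   # ~12.2 fps: a ~3 fps gap, both unplayable

slower_at_60_target = 60.0 / 1.35    # ~44 fps: what the slower part manages when
                                     # the faster one hits a 60 fps target
print(f"PC {pc_fps:.1f} fps vs PS5 {ps5_fps:.1f} fps (gap {pc_fps - ps5_fps:.1f} fps)")
print(f"At a 60 fps target, the slower part sits around {slower_at_60_target:.1f} fps")
```

Same percentage gap, very different real-world meaning depending on the baseline.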

The only way to make the 2080 Ti obsolete is to push a next-generation AMD GPU into that box that performs faster than a 2080 Ti, and AMD has nothing like that at this point. Frankly, Microsoft would just release an Xbox Series X1 a year later with that GPU in it if they cared to, so what's the point.

Also, PC gamers do not sit at 4K, which will give that 2080 Ti even longer legs than those consoles have, unless they start dropping resolutions, and they will, you can count on that straight out of the gate.

The reason I say Death Stranding is because it's a visually impressive late-generation game that runs on a console-based engine. It should give a much better picture than some of the other games that have been compared.

Death Stranding is not interesting. They could slam out a PC version with 30x the draw distance at low settings, only renderable on a 16-core Ryzen with 2x 2080 Tis. That doesn't make the original PS4 faster than a 30 TFLOPS PC with double the cores.

If it needs higher requirements, then the reasons are:

1) It's built around current-day technology, looking at what PC gamers have and designing that version around it (which is what mostly happens). That raises the base quality of the game.
2) It's optimized like dog shit (aka Skyrim in the PS3 era). Which is highly likely to be the case if that engine has never pushed out PC titles before. And no, optimisation isn't an issue on PC, as pretty much 90% of games (and a metric ton more PC games release) have zero issues with optimisation. It ain't the 2000s anymore.

Then there is also the fact that hardware fluctuates in performance: Nvidia and AMD will never have an equal, steady output in every game, because they are different cards with different drivers that do things differently. In one game they can be as much as 50% apart, in another it's only 15%. That's why you always look at averages across multiple games and never single out any specific game, because that would be utterly pointless.
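To illustrate what I mean by averaging (entirely made-up ratios, nothing measured), a quick sketch:

```python
# Made-up per-game ratios, just to show why averaging across titles matters.
from statistics import geometric_mean

# Hypothetical card-A-over-card-B ratios per game (1.50 = A is 50% faster in that title).
ratios = [1.50, 1.15, 0.95, 1.20, 1.10]

print("Per-game gap:", ", ".join(f"{(r - 1) * 100:+.0f}%" for r in ratios))
print(f"Across all games (geometric mean): {(geometric_mean(ratios) - 1) * 100:+.0f}%")
```

Cherry-pick the first game and card A looks 50% faster; cherry-pick the third and it looks slower. Only the average across many titles tells you anything.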

This is why Death Stranding's performance means nothing. It only means something to people who want to push an agenda or hope to reinforce their idea that console optimisation is still a thing in this day and age, while consoles keep losing more and more of their edge by locking hardware down and using the same tools as the PC platform.

Optimisation means nothing more than lowering settings these days. PC can optimise its games all day long, and far further than consoles ever will. I can make RDR2 run at 30 fps on a 2-core potato CPU with an integrated GPU, looking like this, for 48 bucks. That setup is 4x slower than the base PS4.


Or the same setup running Control at double the PS4 framerate:

[YouTube screenshot of Control running on that setup]



Shitty YouTube screenshot so it looks blurry, but it's actually sharp if you play it. Also, resolution can be pushed down big time, something that can't be adjusted on consoles beyond some very basic settings.

Etc etc.
 
I built a new rig with a 3900X, 64 GB of RAM, and a PCIe Samsung 1 TB SSD, and I'm still using my 2+ year old 1080 Ti. Outside of needing a few settings on low with the GPU, I'll be specced considerably ahead of consoles that don't even release until next year. Once there is a GPU that is a minimum of 2x across the board over the 1080 Ti, I'll get that.
 
If the most expensive settings could run at higher levels on the Xbox One X and the Xbox One X had no problem hitting 30 fps with them, then why even bother putting them at lower-than-low settings to start with?
Because it can't run them.


Also, that 580 is running it at ~28 fps, not 20 fps, at 4K with higher settings than the Xbox One X. And it's considered a mess of a launch on PC. So your numbers are wrong, and that's also why your conclusion is wrong.
Well, if you want exact numbers, here's Digital Foundry's test.


4K - 22fps minimum and 25fps average on the RX 580 with Xbox One X settings. Of course these are not 100% exact Xbox One X settings, but they should be close enough that any remaining settings differences between the Xbox One X and PC shouldn't change performance much. The Xbox One X version runs at a solid 30fps, so obviously the average fps must be even higher (something like 35fps) in order to sustain the 30fps lock for the majority of the time. So the RX 580 would need about 10fps more in order to match the Xbox One X experience, and I don't believe it's realistic to expect that much more performance when the game already runs at almost minimal settings.
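A minimal sketch of why the uncapped average has to sit well above the cap (the jitter model and numbers are my own illustration, not DF's data): frame times fluctuate, and any frame over the 33.3 ms budget shows up as a drop under a 30fps lock.

```python
# Minimal sketch: frame times fluctuate around the average, so an uncapped average
# near 30fps still drops plenty of frames under a 30fps cap, while ~35fps holds it.
import random

def dropped_share(avg_fps: float, jitter_ms: float = 4.0, frames: int = 100_000) -> float:
    """Share of frames that blow the 33.3 ms (30fps) budget, given an average
    frame time of 1000/avg_fps ms with uniform +/- jitter (illustrative model)."""
    budget_ms = 1000.0 / 30.0
    mean_ms = 1000.0 / avg_fps
    random.seed(0)
    misses = sum(
        1 for _ in range(frames)
        if mean_ms + random.uniform(-jitter_ms, jitter_ms) > budget_ms
    )
    return misses / frames

for avg in (25, 30, 35):
    print(f"{avg} fps average -> ~{dropped_share(avg):.0%} of frames miss a 30fps lock")
```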

When it comes to the PS4 Pro, the RX 470 is still Polaris based. There's not much difference between the RX 480 and the RX 580, and if you see the Xbox One X pushing 2x as many pixels as the PS4 Pro, it's only because MS did their own optimizations. Here's a long list.

Most importantly, the Xbox One X GPU has more bandwidth than the RX 580 (256.0 GB/s on the 580 vs 326.4 GB/s on the Xbox One X), and that alone should result in better performance at higher resolutions.

I know DF likes to use the RX 580 in their Xbox One X performance comparisons, but it's not the same GPU, and people should keep that in mind, especially when they are trying to guess what it takes to match 2x the Xbox One X's GPU performance.
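For reference, a quick back-of-the-envelope comparison using the approximate public specs I'm assuming for these parts (not measured numbers):

```python
# Approximate public specs (my assumptions, not measurements) for the parts
# discussed above: FP32 compute and memory bandwidth, relative to the PS4 Pro.
specs = {
    # name: (TFLOPS FP32, memory bandwidth GB/s)
    "PS4 Pro":    (4.2, 218.0),
    "RX 580":     (6.2, 256.0),
    "Xbox One X": (6.0, 326.4),
}

base_tf, base_bw = specs["PS4 Pro"]
for name, (tflops, bw) in specs.items():
    print(f"{name:<11} {tflops:.1f} TF ({tflops / base_tf:.0%}), "
          f"{bw:.1f} GB/s ({bw / base_bw:.0%})")
# The One X's edge over the RX 580 is bandwidth (~27% more), not compute,
# which is the point about higher resolutions made above.
```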
 
I wonder how much PC spec requirements for games will jump when next gen appears. Like, how much would it then cost to have a PC on the same level as the consoles in the same games, given they will all run at 4K?
 
Don't expect that in-engine cutscene to be the standard for next gen....

Also, Digital Foundry puts those new consoles at 5700 XT levels of power. The 2080 is still way more powerful.
 
Don't expect that in-engine cutscene to be the standard for next gen....

Also, Digital Foundry puts those new consoles at 5700 XT levels of power. The 2080 is still way more powerful.

But if these consoles aren't cheaper than a 2080, they are going to be dead in the water!

It's never been about pure power; it's about as much power as possible at a price point that is affordable to the masses.

If a high-end PC isn't more capable than a console then something has gone very wrong!

The key thing is how fit-for-purpose the hardware is, and I think it's fair to say that despite all the ridiculous wailing about the use of Jaguar cores, the current gen has proven itself very capable and worthy. There are plenty of truly visually stunning titles out there running on hardware that wasn't top-of-the-line when it first appeared in consoles 6 years ago.

The next gen of hardware looks to be a considerable step up on that, which is nice, but objectively, if you look at how little games have evolved since the start of the 360/PS3 era, you really have to wonder how much of that power is going to be reflected in the actual gaming experience.

The Switch is a great example of how convenience and form factor are just as strong a selling point as computational grunt; the movement towards hardware-agnostic streaming solutions is another.

The worry shouldn't be that consoles aren't going to overtake PC; it should be that they become good and versatile enough alternatives to a high-end gaming PC, at a much lower price, that they cut into that enthusiast market.
 
Yes they will, and since consoles are a closed platform they'll have around twice the performance of an equivalent PC part, as per Carmack.

Edit:

[screenshot of the Carmack quote]


Edit: @nkarafo, why the laugh?
There is optimisation to be had on a console's hardware if the time is taken (exclusive games).
But overall, say in multiplat games, an OG PS4 isn't outperforming a 7870 desktop GPU.
 


I agree with your post and it is well written, however there is one bit I have to rebut.

The Jaguar CPU is absolute trash.

How much has it really hurt, though? I'd argue that its relative weakness has had a positive impact, in that it's encouraged developers to use GPU compute for tasks that under other circumstances would have been handled by the CPU.
 