If the PS5 GPU is only slightly better than a GTX 1080, that's pretty pathetic.

You can have multiple different builds for the different machines... but if they were doing that, unpatched base XBOX One games would not work on the XBOX One X...
You're right, the way APIs work isn't by magic, it's well documented. You provide a standardized layer so that you do not need to patch the game and the GPU vendor can make any necessary changes in the driver instead. This is why ancient as fuck games and ancient as fuck versions of DirectX, say DX8, work on a 2080 Ti, but ancient as fuck drivers do not. The driver is the hardware specific part of the chain. The point of the API is not to make hardware a complete non-factor but to ensure the game itself is hardware agnostic. If you ensure the game is hardware agnostic then you do not need the source code to make it run on newer hardware that did not exist at the time, you can modify the driver instead.
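That split (hardware-agnostic game code on one side, hardware-specific driver on the other) can be sketched in a few lines of Python. All names here are invented for illustration; real drivers are native code behind DirectX/Vulkan, not Python objects:

```python
class ApiDevice:
    """Stable, hardware-agnostic interface the game is written against."""
    def __init__(self, driver):
        self._driver = driver  # hardware-specific part, replaceable per GPU

    def draw_triangles(self, count):
        # The API call never changes; the driver decides what the
        # underlying hardware actually executes.
        return self._driver.submit("DRAW", count)

class Driver2002GPU:
    def submit(self, op, count):
        return f"legacy-gpu: {op} x{count} via fixed-function path"

class Driver2019GPU:
    def submit(self, op, count):
        return f"modern-gpu: {op} x{count} via unified shaders"

# The same unmodified "game" code runs on hardware from either era:
game_call = lambda dev: dev.draw_triangles(1000)
old = game_call(ApiDevice(Driver2002GPU()))
new = game_call(ApiDevice(Driver2019GPU()))
print(old)
print(new)
```

Swap the driver, keep the game: that is the whole contract that lets a DX8 title run on a 2080 Ti.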
True, and very well explained.

But things change when you want to get the most performance possible. APIs can act as a layer between code and driver, but that does not mean you can't bypass them when necessary.

Sure, third-party/indie/low-budget devs won't care. But first-party showcases? There's metal for sure.
 
So to be clear, we're discussing specs that are just leaked, never confirmed, on a product that has no outlined timeline, is in constant development and contract negotiations, and talking about it like we know best? Right
 
This isn't that complicated. Xbox was a more modern, more powerful machine, yes. It was also not a balanced piece of hardware like GameCube.
I don't want to go off-topic about GC and Xbox in a PS5 thread, so let's continue our discussion here
 
So to be clear, we're discussing specs that are just leaked, never confirmed, on a product that has no outlined timeline, is in constant development and contract negotiations, and talking about it like we know best? Right

That's right, we're discussing specs that are based on various (unproven) leaks because, well, it's fun. What's the problem with that?

Also, people suspect 1080ti power because of that Twitter comment about the GPU doing upwards of 20,000 in the Fire Strike 3DMark test. That's what the 1080ti scores.

That card is about 11TF, which we've been told the PS5 is capable of.
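For reference, the ~11TF figure follows from the usual back-of-the-envelope FP32 estimate: shader cores x 2 ops per clock (fused multiply-add) x boost clock. A quick sketch using the 1080 Ti's published core count and boost clock:

```python
# Rule-of-thumb peak FP32 throughput: cores x 2 (FMA) x clock in GHz
def tflops(cores, boost_ghz, ops_per_clock=2):
    """Peak single-precision teraflops from core count and clock."""
    return cores * ops_per_clock * boost_ghz / 1000.0

# GTX 1080 Ti: 3584 CUDA cores, ~1.582 GHz boost clock
print(f"GTX 1080 Ti: ~{tflops(3584, 1.582):.1f} TFLOPS")  # ~11.3
```

Peak numbers, of course; sustained throughput depends on the workload.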

It might all turn out to be total bullshit at the very end of the day, but, again, it's fun to read the leaks and speculate.
 
Currently you have 4 different GPU architectures out there: Maxwell/Pascal, Turing, GCN and RDNA, just at different power levels. I think that at least in the GPU space things are pretty clear; devs don't optimize for older hardware and don't give a fuck about Intel GPUs for the most part. With CPUs you have Core and Ryzen, and DDR3/DDR4 for memory (the only difference there is speed).

There are also 5 different consoles on the market: Switch (Nvidia/Maxwell), X1 (GCN 1.0, ESRAM), PS4 (GCN 1.1), PS4 Pro (GCN 1.4+) and X1X (GCN 1.4), so these are not the same good times for developers as when there were only 2/3 machines to code for :)
You forget the fact that all the above machines use a shader-based GPU architecture.

It's nothing like the OG XBOX (shaders) vs GC (TEV) vs PS2 (Crazy Ken architecture).

So yeah, it's easier these days, because gaming machines are more homogenized than ever before. :)
 
Who are "we"? Because it's 100% guaranteed that $499 is really pushing it for the masses.

Anything above that and they've f*cked up, both Sony and MS.


At what point is $499 okay then? Are we gonna be in the year 2050 still demanding $399 consoles with decent hardware? It's just ridiculous to ask them to cram anything halfway decent into a $399 box in 2020 WITH a 4K Blu-ray player and SSD. You're talking at least $120 (an extremely modest estimate) in manufacturing cost added on top of the PS4's 2013 cost, AND inflation since then. And the PS4 barely sold at a profit. Inflation is a real thing; Sony doesn't get to build consoles at the same fixed price locked in 10 years ago. Worker pay, warehousing, shipping, all that shit has gone way up.
Not to mention the CPU is more expensive in 2020 than it was in 2013, and that's assuming the PS5 doesn't need a more expensive cooling solution like the X1X had because of the massive increase in power.
 
True, and very well explained.

But things change when you want to get the most performance possible. APIs can act as a layer between code and driver, but that does not mean you can't bypass them when necessary.

Sure, third-party/indie/low-budget devs won't care. But first-party showcases? There's metal for sure.

This is a myth. Nobody is writing games in assembly anymore, and it'd be silly to skip the APIs; might as well not have any. AFAIK the APIs themselves have gotten much better, such as Vulkan, which lets the developer communicate with the HW at a lower abstraction level.

What console developers have is very intimate knowledge of, and analysis tools for, the HW resources they are working with, and they can tailor the application to make sure those resources are being maximized.

That's it, there's no "to the metallllll!!! ConsoleZ FOREVEA!!" black magic going on.
 
Writing CPU assembly is still possible, but it's not "code to metal" per se, especially for CISC/x86 CPUs (you don't get access to RISC micro-ops or internal registers used by register renaming).

It's also possible to write AMD GCN assembly (which is actually RISC-y), but I don't know if console platform holders allow it these days (since it could complicate BC for next-gen consoles).

 
At what point is $499 okay then? Are we gonna be in the year 2050 still demanding $399 consoles with decent hardware? It's just ridiculous to ask them to cram anything halfway decent into a $399 box in 2020 WITH a 4K Blu-ray player and SSD. You're talking at least $120 (an extremely modest estimate) in manufacturing cost added on top of the PS4's 2013 cost, AND inflation since then. And the PS4 barely sold at a profit. Inflation is a real thing; Sony doesn't get to build consoles at the same fixed price locked in 10 years ago. Worker pay, warehousing, shipping, all that shit has gone way up.
Not to mention the CPU is more expensive in 2020 than it was in 2013, and that's assuming the PS5 doesn't need a more expensive cooling solution like the X1X had because of the massive increase in power.
Most of us are aware of that.
But the masses see the prices and say "oh my, that's expensive".

I'm not saying $499 is bad pricing, I'm saying something like $599 is gonna be an issue early on next-gen.
 
The PS2 doesn't have any OS running in the background; all available resources are used by games. Windows XP on PC requires 64MB of RAM (with 128MB recommended), while current-gen consoles both eat up to 3GB of RAM for their operating systems.
The RAM available to the GPU on the PS2 was even less than 32MB
 
Most of us are aware of that.
But the masses see the prices and say "oh my, that's expensive".

I'm not saying $499 is bad pricing, I'm saying something like $599 is gonna be an issue early on next-gen.
The masses should be taught economics, preferably from 1st grade. It's not a problem that companies need to solve, it's a (public) education problem that governments need to take care of.

For example, $299 was the golden price point for the PS1/PS2 era, but not for the PS4 era, therefore they had to bump it to $399. Should we have stayed at $299 forever?

$499 will be acceptable for the PS5 in 2020 and even $599 will be acceptable for the PS6 in 2027. That's how inflation works.

Either way, modern consoles are cheaper than ever before:



 
The masses should be taught economics, preferably from 1st grade. It's not a problem that companies need to solve, it's a (public) education problem that governments need to take care of.

For example, $299 was the golden price point for the PS1/PS2 era, but not for the PS4 era, therefore they had to bump it to $399. Should we have stayed at $299 forever?

$499 will be acceptable for the PS5 in 2020 and even $599 will be acceptable for the PS6 in 2027. That's how inflation works.

Either way, modern consoles are cheaper than ever before:




This would be fine and dandy if people's pay was going up with inflation, but it's not. Wages are stagnant. It's why they're so reluctant to push games up to $69.99.
 
This would be fine and dandy if people's pay was going up with inflation, but it's not. Wages are stagnant.
Fair point, but the 2013 economic climate was much worse than 2020's will be. The recession is mostly over these days.

It's why they're so reluctant to push games up to $69.99.
Not really.

Games can still afford to keep a $60 MSRP because we have digital distribution, subscriptions and season passes/DLC/MTX (it all adds up). :)

Would you be happy if game prices were bumped to $90-100 if they abolished DLC/MTX? I'm sure you wouldn't like it, so $60 it is.
 
So to be clear, we're discussing specs that are just leaked, never confirmed, on a product that has no outlined timeline, is in constant development and contract negotiations, and talking about it like we know best? Right

Nah, you're just in a Slash pedestal honeypot troll thread.
 
Writing CPU assembly is still possible, but it's not "code to metal" per se, especially for CISC/x86 CPUs (you don't get access to RISC micro-ops or internal registers used by register renaming).

It's also possible to write AMD GCN assembly (which is actually RISC-y), but I don't know if console platform holders allow it these days (since it could complicate BC for next-gen consoles).


Thanks! Really interesting. I don't do console development, mostly Java and Angular at a financial company, so I wouldn't come across this in my day-to-day. What I'm not clear on is whether using this would prevent an application from running on a non-GCN GPU. If it does, then I can't imagine a lot of developers are using it.
 
What I'm not clear on is whether using this would prevent an application from running on a non-GCN GPU. If it does, then I can't imagine a lot of developers are using it.
Of course it would prevent it.

CPUs share a common ISA (i386/AMD64) and BC is handled by the microcode (it translates CISC macro-ops into RISC micro-ops), but GPUs are not like that. AMD and Nvidia have completely different ISAs, and even different GCN iterations have slightly different assembly output.

GPUs have a driver that translates high-level API calls to native machine code. Even DX12/Vulkan is not as low-level as raw assembly.

I don't know if 1st party PS4 exclusives use raw GCN assembly for certain compute-intensive algorithms.
 
They aren't as heavily optimized anymore because they use the same parts as PCs. Back in the day you had the PS2 CPU running at 300MHz, yet it was a better gaming machine than a 1GHz Pentium 4 box, because it wasn't the same part; it was designed specifically for games.

Nowadays that's not the case.
That's not how software works on consoles, m8.
 
That's how APIs work. That's the entire point of their existence: to ensure compatibility at the cost of maximum performance. Console games can't go bare metal anymore, not unless devs are going to patch games in perpetuity. A game that runs on bare metal on the XBOX One won't work on the XBOX One X, for example. Next-gen wouldn't be compatible with any games running on bare metal either.



Adding for the discussion
 

Adding for the discussion

Where this thread should've been posted. But the OP needed his once-a-month Sony SlashBash bait thread, and everyone fell for it.
 
You have to keep in mind consoles will never, ever target 60fps. Devs will always favor better-looking graphics over performance on consoles. So a 1080 performance level for the PS5 is huge, especially if it comes at a $500 price.
 
I don't know if 1st party PS4 exclusives use raw GCN assembly for certain compute-intensive algorithms.

You don't need to. Most of the bottlenecks are in CPU/GPU synchronization and command buffer build-up.
Shader kernels are usually pretty simple functions and can be optimized by the compiler.
It's the same for PC CPUs, actually: computations are rarely hand-optimized, but memory access patterns and the control flow itself are much more frequent optimization targets.
On PC, for example, you cannot even control shader compilation steps or command buffer building in most APIs. And when you can (Vulkan), it's explicitly stated that your code is no longer portable to any other GPU; you have to ensure portability yourself.
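That point about access patterns generalizes beyond shaders. Here's a toy sketch of the idea; note that in an interpreted language like Python the interpreter overhead masks most of the cache effect, whereas in native code the cache-friendly traversal order can be several times faster:

```python
import timeit

# Row-major 2D grid; values are arbitrary small ints.
N = 500
grid = [[(r * N + c) % 7 for c in range(N)] for r in range(N)]

def row_wise():
    # Walks memory in the order the rows are laid out (cache-friendly).
    total = 0
    for r in range(N):
        for c in range(N):
            total += grid[r][c]
    return total

def col_wise():
    # Same reduction, but jumps between rows on every step.
    total = 0
    for c in range(N):
        for r in range(N):
            total += grid[r][c]
    return total

# Same result either way; only the memory access order differs.
assert row_wise() == col_wise()
print("row-wise:", timeit.timeit(row_wise, number=3))
print("col-wise:", timeit.timeit(col_wise, number=3))
```

The computation is identical in both functions; only the traversal order changes, which is exactly the kind of optimization target being described.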
 
Yeah 1080 performance actually would be pretty pathetic. You're talking about a GPU that'll be over 4 years old by the time PS5 releases.

That would be like the PlayStation 3 RSX being equivalent to a Geforce 4 Ti.
 
I could see Switch 2 doing 1080p in like 2022 or 2023 with a good upscaler built in for 4k, but I expect 4k native for ps5 and Xbox.
 
You have to keep in mind consoles will never, ever target 60fps. Devs will always favor better-looking graphics over performance on consoles. So a 1080 performance level for the PS5 is huge, especially if it comes at a $500 price.
I wager the new gen will usher in 60fps as the benchmark, and 30fps will be remarkably rare. The balance of CPU and GPU will finally allow devs to have their cake and eat it too.
 
I wager the new gen will usher in 60fps as the benchmark, and 30fps will be remarkably rare. The balance of CPU and GPU will finally allow devs to have their cake and eat it too.

I'm still skeptical of that. I think they will almost always go for more bells and whistles over fps simply because graphics are more marketable.
 
Yeah 1080 performance actually would be pretty pathetic. You're talking about a GPU that'll be over 4 years old by the time PS5 releases.

That would be like the PlayStation 3 RSX being equivalent to a Geforce 4 Ti.
In PS3 times GPU technology was advancing much faster; the 8800 GTX launched the same year as the PS3 and was like 3-4x faster. These days, however, even a 3-year-old GPU like the GTX 1080 is still really capable, and even a year from now the GTX 1080 will still be considered good enough.
 
In PS3 times GPU technology was advancing much faster; the 8800 GTX launched the same year as the PS3 and was like 3-4x faster. These days, however, even a 3-year-old GPU like the GTX 1080 is still really capable, and even a year from now the GTX 1080 will still be considered good enough.
Not just that, but PCs have also increased their TDP quite a bit (both in the GPU and the CPU department).

Remember 3D accelerators like 3DFX Voodoo and Riva TNT?

The former didn't even have a heatsink, and the latter was a small-ish card (kinda like a GeForce GT 1030) with a puny heatsink/cooler (akin to the low-end cards of today).

Is that a coincidence?

"Powerful" consoles like Nintendo 64 didn't have a triple-digit TDP either, and yet, they were considered powerful back then. Nintendo Switch has a similar TDP (in docked mode) and it's considered "weak".



I don't remember 90s PCs having fancy cooling systems, 700W PSUs and all that jazz.

That's why I don't understand the ridicule of consoles these days by PCMR folks (lol, this meme didn't even exist back in the 90s). Modern consoles have increased their TDP quite a bit (I'm talking about PS/XBOX) to keep up with PCs.

Old-school consoles had no cooling at all and yet, everyone was amazed by them. What gives?
 
Why would that be pathetic?

Take the PS5 out of the equation.

If someone pondered "hey, I'd like to buy a relatively cheap PC in the $400-600 range. I don't need a powerhouse. I just need something that can run most stuff", this is the kind of GPU you'd recommend to them. Furthermore, low/mid-tier GPUs are what developers are targeting. Sure, you can crank up the settings if you have a more powerful rig, but that GPU from 3 years ago is the average that devs are shooting for.

So what's the difference? Well, the difference is that the [console] offers a guaranteed baseline. On PC, it's a guess. That is why a PS2 with 32MB of RAM (plus 4MB of video RAM) and a 300MHz processor ended up with better-looking games at the end of its life than what a comparable Pentium II + GeForce 256 was pulling off over the same timeframe. I could run Unreal Tournament far better on PC than the PS2 port, but a few years later, in 2004, the same setup would've been incapable of running UT 2004. The PS2? Still trucking along.
 
It will be just fine. A PC has to be designed to do a multitude of things, in thousands and thousands of hardware and software combinations, etc. A console is designed with one OS, one set of hardware specs, and only one real purpose. It's highly streamlined at doing its one job.

Much as I love good graphics, whether the game is fun to play is the biggest thing for me. High-end graphics don't mean a good game. I'm playing Bloodstained now and I love it despite it not being a graphics powerhouse. I just beat Sekiro and loved it on my PS4 Pro. Mario Kart on my Switch is a blast.

Yes I am going to get a ps5 because it will have better hardware, but ultimately I'll buy one because it will be the console with the most exclusives I want to play.
 
PC games aren't designed for "hundreds" of different configurations; developers put what they require in the minimum settings. No one would make games on PC if that were the case. It's usually the last 2 generations of GPUs/CPUs and one or two OSes; if you have older hardware the game may run, but there's no guarantee.

Right now games are optimized for Pascal/Turing, GCN/RDNA, Core 6xxx and up (the 9 series is still based on Skylake) and Ryzen.
 
The masses should be taught economics, preferably from 1st grade. It's not a problem that companies need to solve, it's a (public) education problem that governments need to take care of.

For example, $299 was the golden price point for the PS1/PS2 era, but not for the PS4 era, therefore they had to bump it to $399. Should we have stayed at $299 forever?

$499 will be acceptable for the PS5 in 2020 and even $599 will be acceptable for the PS6 in 2027. That's how inflation works.

Either way, modern consoles are cheaper than ever before:




I love when people like you begin talking about economics as if you didn't just read a few articles about "inflation" and the time value of money and totally aren't parroting back what you've read from blog sites. Although, if you really do have a degree in economics, I totally retract that.

No, console MSRPs should not be treated as a gold standard that rises with inflation, because consoles are not a commodity. Economies of manufacturing scale have increased, price per performance is decreasing, and businesses are making money from a diversified stream not tied solely to hardware. There might be an argument that the new de-facto price point should be $399, but "inflation" is not it, especially when a midrange computer today is probably a quarter of the cost of a midrange computer in 1995 and perhaps half the cost of one in 2005.

We're in a time when you can put together an amazingly capable gaming PC for $500, and console manufacturers have simply realized they've amassed enough fans to slowly take losses on hardware out of the equation.
 
Not just that, but PCs have also increased their TDP quite a bit (both in the GPU and the CPU department).

Remember 3D accelerators like 3DFX Voodoo and Riva TNT?

The former didn't even have a heatsink, and the latter was a small-ish card (kinda like a GeForce GT 1030) with a puny heatsink/cooler (akin to the low-end cards of today).

Is that a coincidence?

"Powerful" consoles like Nintendo 64 didn't have a triple-digit TDP either, and yet, they were considered powerful back then. Nintendo Switch has a similar TDP (in docked mode) and it's considered "weak".



I don't remember 90s PCs having fancy cooling systems, 700W PSUs and all that jazz.

That's why I don't understand the ridicule of consoles these days by PCMR folks (lol, this meme didn't even exist back in the 90s). Modern consoles have increased their TDP quite a bit (I'm talking about PS/XBOX) to keep up with PCs.

Old-school consoles had no cooling at all and yet, everyone was amazed by them. What gives?
Yes, I remember the TNT2 days; in fact, the TNT2 was my first card 😉. I forgot to mention the TDP argument, but you're absolutely right: TDP wasn't an issue many years ago, but these days high-end GPUs are power-hungry beasts, and loud as well. My Strix 1080 Ti was loud as a jet :P, while my Xbox One X was always quiet. If consoles match GTX 1080 performance and stay as quiet as the Xbox One X, I won't complain.
 
Performance around Vega/GTX 1080 was the most plausible target, as that was AMD's ceiling and we knew Navi was going to aim for it. With the new architecture, AMD's teraflops-to-real-performance ratio has changed, so ~9 TF doesn't sound as impressive as 12/13, and that hurts console fanboys ;)
 
Not just that, but PCs have also increased their TDP quite a bit (both in the GPU and the CPU department).

Remember 3D accelerators like 3DFX Voodoo and Riva TNT?

The former didn't even have a heatsink, and the latter was a small-ish card (kinda like a GeForce GT 1030) with a puny heatsink/cooler (akin to the low-end cards of today).

Is that a coincidence?

"Powerful" consoles like Nintendo 64 didn't have a triple-digit TDP either, and yet, they were considered powerful back then. Nintendo Switch has a similar TDP (in docked mode) and it's considered "weak".



I don't remember 90s PCs having fancy cooling systems, 700W PSUs and all that jazz.

That's why I don't understand the ridicule of consoles these days by PCMR folks (lol, this meme didn't even exist back in the 90s). Modern consoles have increased their TDP quite a bit (I'm talking about PS/XBOX) to keep up with PCs.

Old-school consoles had no cooling at all and yet, everyone was amazed by them. What gives?

To answer your question: all you're seeing is a bit of ignorance on the technical side when people make those comments.

As pointed out, the graphics arena has not jumped significantly in the past 3 to 4 years like it did a decade ago and earlier.

We are reaching the limits of silicon, so either die sizes have to grow or a new approach needs to be developed, if it even can be.
 
Most of us are aware of that.
But the masses see the prices and say "oh my, that's expensive".

I'm not saying $499 is bad pricing, I'm saying something like $599 is gonna be an issue early on next-gen.

Yeah, $599 is just out of the question; if they're going to increase pricing, they already understand that it can't be a $200 jump.
 