Rumour: PS5 devkits have been released (UPDATE 25th April: 7nm chips moving to mass production)

This PS5 is a turtle. I think it looks more like a slug trying to outrun a fox. But unfortunately you are right. The new generation will probably be completely devoured by 4K resolution. We will not see a significant leap in graphics. 4K consumes a lot of resources, and even though the PS5 does 4K checkerboard, I do not believe there will be much performance left over to improve the graphics.

Don't you think the next PlayStation could bring the same VEGA 64 performance?

2016 PS4 Pro = 4.2 TFLOPs / GTX 1050 Ti

2017 Xbox One X = RX 580 / GTX 1060

Hypothetical 2018 console = VEGA 56 / GTX 1070 performance

Hypothetical 2019 PS5 = VEGA 64 / GTX 1080 performance

What do you think? Even if the PS5 comes out with 10 TFLOPs, I would say that NAVI's performance per TFLOP is higher than VEGA's. I can't believe that Sony will bring us a turtle of a console with the same performance as a 10.5 TFLOP VEGA 56, which is equivalent to a GTX 1070. That would be ridiculous...

Man, your posts are sometimes just bizarre. Good luck running games at PS4 Pro settings and resolution on a 1050 Ti. LOOOOOOOOL. I explained this to you before, but hell, I will do it again: there is no "TFLOP advantage" for Nvidia cards in a console environment. If you build your rendering pipeline in a way that it can tap all available 11 TFLOPs, 11 TFLOPs is 11 TFLOPs. So VEGA 56 performance would be much more like GTX 1080 Ti performance in a console environment. There are other bottlenecks that Nvidia solves better than AMD (like color compression), but that has nothing to do with computational power...
 
Native 4K is a stupid use of resources when you can reach a pretty much indistinguishable result with CB + temporal injection at nearly half the rendering load.

NEITHER MICROSOFT NOR SONY SHOULD CHASE NATIVE 4K IF THEY WANT TO DELIVER A TRUE GENERATIONAL LEAP
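The resource argument above comes down to shaded pixel counts. A rough sketch (assuming checkerboard rendering shades about half the native pixels per frame and reconstructs the rest temporally):

```python
# Rough per-frame shaded-pixel comparison: native 4K vs. checkerboarding.
# Assumes CB shades ~half the native pixel count each frame, with the
# other half reconstructed from the previous frame (temporal injection).

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)     # 8,294,400 pixels shaded per frame
checkerboard_4k = native_4k // 2   # ~4,147,200 pixels shaded per frame
native_1800p = pixels(3200, 1800)  # 5,760,000 pixels, a common middle ground

print(native_4k, checkerboard_4k, native_1800p)
```

On these numbers, checkerboarded 4K shades fewer pixels per frame than even native 1800p, which is where the "nearly half the rendering load" claim comes from.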
 
Man, your posts are sometimes just bizarre. Good luck running games at PS4 Pro settings and resolution on a 1050 Ti. LOOOOOOOOL. I explained this to you before, but hell, I will do it again: there is no "TFLOP advantage" for Nvidia cards in a console environment. If you build your rendering pipeline in a way that it can tap all available 11 TFLOPs, 11 TFLOPs is 11 TFLOPs. So VEGA 56 performance would be much more like GTX 1080 Ti performance in a console environment. There are other bottlenecks that Nvidia solves better than AMD (like color compression), but that has nothing to do with computational power...

Are you crazy? :messenger_dizzy:

The Xbox One X with 6.2 TFLOPs reaches about the same performance as a GTX 1060, which has fewer TFLOPs... I don't know whether the problem is properly called performance per TFLOP or not. I know I could be wrong about it. I was just using TFLOPs as a way to compare the differences in performance! But you know that, in some way, AMD GPUs never perform equivalently to Nvidia GPUs. And we've seen that the Xbox One X, even with 6.2 TFLOPs, is not equivalent to a 6.5 TFLOP GTX 1070.
 
Please read again what I wrote. You can't extrapolate PC benchmarks to consoles. And no, a GTX 1060 does not perform as well as an X1X if you aren't CPU bound (which current-gen consoles sadly mostly are).
 
Please read again what I wrote. You can't extrapolate PC benchmarks to consoles. And no, a GTX 1060 does not perform as well as an X1X if you aren't CPU bound (which current-gen consoles sadly mostly are).

OK! Do you believe we'll see an amazing leap, or at least the same generational leap we saw from PS3 to PS4? Even with 10 TFLOPs? :messenger_smiling:

 
I understand you... But you need to know that the Xbox One X runs about 70% of games at dynamic 4K and 30% at native 4K (especially Xbox 360 games). The majority of games run at 30fps. Oh! But Wolfenstein II, Halo 5 and DOOM (2016) run at 60fps! OK... but they use dynamic resolution, as I said. Well, I remember Forza 7 running at native 4K 60fps all the time, and Forza Horizon 3 at native 4K 30fps. That's OK... But if the next-gen consoles reach dynamic 4K 30fps, and just a few games hit 4K 60fps, it will not be a downgrade, because these next-gen consoles will be running at least the same resolutions as the Xbox One X, but with big increases in terms of graphics.

Be honest!!! When you look at the same scene of a game on a native 4K monitor and on another monitor running dynamic 4K, even at 3200x1800, can you see any difference? Especially if the 1800p image is upscaled to 4K. Could you really see any difference?

But despite everything I said, I am not just making things up; I'm reflecting on the possible specs of the new consoles. If the PS5 comes with the same performance as a GTX 1070 / VEGA 56, it will be very difficult to believe that it can deliver native 4K with substantially better graphics than the current generation. I would only believe it if this console relied on dynamic 4K. Or do you think a console with GTX 1070 performance can do native 4K 60fps plus a jump in graphics equivalent to what we saw from PS3 to PS4? I would not believe it unless it were done through checkerboarding... Otherwise we'll have to wait until the end of 2020/2021 to get something on the level of a super turbo-charged GTX 1080 Ti.

On the other hand, lots of games are running with high-quality textures, custom engines like the one for Horizon show great results, etc. Those two consoles are "beasts" for current games, considering they're based on low-power mobile chips, but I see what you mean.

To be honest, I expect all of those beauties for next gen, with a majority of 60fps games, except maybe some open worlds. After years on the Pro and the X, I expect games to both look and run great. Devs are now used to those techniques.

To me, the Pro and the X are mostly everything I've heard about in 3D since the early PSX days. It's like a dream. Next gen should be the NEO GEO of 3D: powerful enough to run current games at max settings and to create its own experiences.

When did we start to get leaks regarding the tech for PS4 and XB1? Were they officially announced? Silence means 2020?
 
On the other hand, lots of games are running with high-quality textures, custom engines like the one for Horizon show great results, etc. Those two consoles are "beasts" for current games, considering they're based on low-power mobile chips, but I see what you mean.

To be honest, I expect all of those beauties for next gen, with a majority of 60fps games, except maybe some open worlds. After years on the Pro and the X, I expect games to both look and run great. Devs are now used to those techniques.

To me, the Pro and the X are mostly everything I've heard about in 3D since the early PSX days. It's like a dream. Next gen should be the NEO GEO of 3D: powerful enough to run current games at max settings and to create its own experiences.

When did we start to get leaks regarding the tech for PS4 and XB1? Were they officially announced? Silence means 2020?

I don't know... But the rumors about the PS5 and Xbox Scarlett are bigger than those about the PS4 were in 2011/2012. Nowadays Microsoft doesn't hide its plans... They spoke about Scarlett and so on. We are getting close. I think that by the end of 2020 these new consoles will already be on store shelves.
 
I'm not concerned about resolution.

Look at The Last of Us Part II. That's 1440p and it looks incredible.

I really hope Sony continues with efficient rendering, as all of their first-party games since the Pro came out have looked far better than most games running at higher resolutions on other platforms, PC included.

The only thing I really wish they would improve is shadow quality.
 
OK! Do you believe we'll see an amazing leap, or at least the same generational leap we saw from PS3 to PS4? Even with 10 TFLOPs? :messenger_smiling:



You know, the thing with tech demos at the beginning of (or just before) a new generation is that games tend to look like that NEXT NEXT gen :p But on that video: Naughty Dog might do something like that in a cutscene next gen. What I really want from next gen is for gameplay to look more like cutscenes. For that to happen we need a decent solution for realtime GI. Even if the new consoles have some dedicated ray tracing hardware (which I really hope they will), RT GI will likely be too heavy to be done decently for the next 4 to 5 years, even on PC hardware. I hope the ray tracing hardware can also be used to accelerate cheaper GI solutions like voxel cone tracing. RT shadows and reflections are fine, but what we really need to make our PBR materials come to life in lighting situations the artists don't control is a decent realtime GI solution.

What I think you don't understand is that a generational jump comes mostly from developers not needing to keep weak last-gen hardware like the X1 in mind when they design their game worlds and assets, and not so much from the jump in hardware capability alone. That's why I think incremental hardware updates, like the ones Microsoft is discussing, are a really horrible idea.
 
You know, the thing with tech demos at the beginning of (or just before) a new generation is that games tend to look like that NEXT NEXT gen :p But on that video: Naughty Dog might do something like that in a cutscene next gen. What I really want from next gen is for gameplay to look more like cutscenes. For that to happen we need a decent solution for realtime GI. Even if the new consoles have some dedicated ray tracing hardware (which I really hope they will), RT GI will likely be too heavy to be done decently for the next 4 to 5 years, even on PC hardware. I hope the ray tracing hardware can also be used to accelerate cheaper GI solutions like voxel cone tracing. RT shadows and reflections are fine, but what we really need to make our PBR materials come to life in lighting situations the artists don't control is a decent realtime GI solution.

What I think you don't understand is that a generational jump comes mostly from developers not needing to keep weak last-gen hardware like the X1 in mind when they design their game worlds and assets, and not so much from the jump in hardware capability alone. That's why I think incremental hardware updates, like the ones Microsoft is discussing, are a really horrible idea.

Yes! But they can't stand still. They always need more performance... Anyway... I'm still waiting for 12 TFLOPs.

Are you a PS5 engineer??? I'm asking because I've already heard rumors of around 11 TFLOPs. These rumors came from a GameSpot insider...
:messenger_open_mouth: Are you hiding something?
 
I think the traditional model for next-generation consoles is going away.
PlayStation and Xbox will just be APIs, much like Android versions, and game makers will be able to target what they want.
That's a debate for another topic entirely, over the possibility of a clean next-gen break for the PS6. I personally don't think generations are going away, because no clean breaks would defeat the purpose of new hardware; even PC games would be affected.

What we know for sure is that a clean-break PS5 is happening, so you can rest assured it won't be a half-step upgrade.
Ghost of Tsushima's graphics look one step ahead of everything we've seen. Do you believe a base PS4 could handle those visuals?
Personally, I don't think it looks a step above the best-looking Sony first-party games already out. Not even TLOU2 looks out of the base PS4's reach; it's the incremental upgrade I expected from Naughty Dog's mastery of the hardware on their second title (or third if you count the remaster).

First of all, you got me wrong there: I didn't rule out the PS5 using 24GB of RAM (my list is pretty much the minimal spec that would make sense for a next-gen system). I said I don't think Sony would delay a console just to provide a bigger amount of it. If they decide to go with 24GB, they will eat the initial extra cost and won't delay the launch for that.
I agree. I don't expect a delay longer than a year, but I also don't expect next gen earlier than late 2020.

Making materials look more realistic (for this, most of all, they have to look less blurry than in late-gen games like Cyberpunk [material quality being the main reason I can't understand people calling it next gen]) is more shading-intensive than memory-intensive. As for the richer-worlds stuff: simulation doesn't need sheer RAM amount, but huge amounts of bandwidth. For bandwidth you can simply widen your memory bus. I'm pretty sure consoles will go with a 384-bit (and not 256-bit) bus in that regard.
Well, the thing is, you are limiting your thinking to the current-gen development paradigm. When this gen started, 2GB of VRAM was more than enough, and see how quickly that changed? Once you have games developed from the ground up with 4K textures, the results will be instantly apparent.

For rich dynamic worlds, I mentioned the importance of memory. Of course you need ample bandwidth to feed said memory, and a powerful GPU/CPU to power those worlds and simulations; all components are important, and a deficiency in one of them would limit the others.

I think ample amounts of memory will be important for next-gen engines' lighting buffers and 4K textures, and to still have enough left to store information about rich, detailed, interactive, diverse and dynamic worlds, and to keep up with players' actions around said worlds.

That, along with lighting, is what will set next gen apart from the current one, IMO.
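The 384-bit vs. 256-bit bus point raised above is easy to quantify. A sketch (using an illustrative 14 Gbps per-pin rate, a common GDDR6-class figure, not a leaked console spec):

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate.
# 14 Gbps per pin is an illustrative GDDR6-class figure, not a known spec.

def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 14))   # 448.0 GB/s
print(bandwidth_gb_s(384, 14))   # 672.0 GB/s, 50% more for the wider bus
```

Widening the bus scales bandwidth linearly at a given memory speed, which is why the poster treats bus width rather than RAM amount as the simulation bottleneck.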
 
That's a debate for another topic entirely, over the possibility of a clean next-gen break for the PS6. I personally don't think generations are going away, because no clean breaks would defeat the purpose of new hardware; even PC games would be affected.


1. Consoles already have PC hardware in them. If a PC can run the whole range of games from literally the '90s onward, then there is no reason why a console with basically the same type of hardware, just newer, wouldn't be able to play all PS4 games natively. Swapping the HD 7850-class GPU in the base PS4 for a VEGA 56-class one in the PS5 is no different from swapping it on a PC. If computers can do it, then for consoles it is even easier.
2. In a similar fashion, there is no reason why a PS4 couldn't run newer titles, just with reduced details and resolution. It used to be a problem because consoles usually had custom hardware in them, which meant that you either had to change the API a lot or emulate; in both cases you ended up in a scenario in which not all games could run.
3. All manufacturers have already realized that point. When you start a new generation, you risk starting anew, meaning that your console might not actually get picked up, for one weird reason or another, and your users would run away to the competition. I predict with 100% certainty that the next Xbox will be like I said: they already mentioned the next Xbox will be a family of devices, which suggests that Xbox will be an API, not just a console (though they will sell a console for that API), and with them pushing developers to use DX, all games will be compatible with PC either way. From the get-go they will have MILLIONS OF USERS without selling even one unit.
4. Developers have already weighed in on this: these days it is way too expensive to make a game for a new console with very low sales numbers. They have said on several occasions that they would want to see consoles moving toward phone-like generations, where games can be released regardless of what console is currently on the horizon. A console should just be a way to maximize visuals easily. E.g., I have a 4K TV and I want 60fps, so maybe I should buy a PS5 Pro; I have a 1600x900 TV and I am fine with 30fps, so I will stay on the PS4 for a while to play GTA6 there.
5. Creating ports is needless in a situation in which there are no generations, only an API. If you release GTA5 on PS5, it could be played on PS4, PS5, PS6789 and so on, as long as they keep releasing updated APIs with support for older hardware.

To sum it up, it makes much more business sense to keep your user base than to lose it every 5 years in the hope of maybe getting it back.
 
It's about developing for the base spec; that's the reason PC games improve drastically after a new gen raises the base-spec bar.
The problem with your comparison is that developers use consoles as their base spec; games' scope would be severely limited if the PS3/360 were still the base spec.

4. This is the reason cross-gen titles exist; this generation showed most of the active user base switched by the second year after the gen started.
5. Backwards compatibility fixes this.
It also makes more business sense to keep releasing cheap ports and remasters.
 
This is the reason cross-gen titles exist,

That is literally my point. For the end user, what I said means that every game will be cross-gen now, not only forward but also backward compatible, BUT there will be games that are exclusive to the PS5, much like you have Android games exclusive to much later versions of Android.

There is no custom hardware anymore, which means there is no reason for "generations" in the first place.
Console generations existed because old hardware was fundamentally incapable of playing new games. The difference was just too vast. Right now that is not the case. Today's best-looking game will still be a good-looking game on the PS5.

Secondly, your "base model" comment does not hold up when the PC market exists and developers have shown that they can calculate for themselves what to target. They will see what audience they want to reach and decide on their own whether to release a game for PS5 only, because they want the best graphics, or, if they are making an indie title, to go for the widest reach.
 
That is literally my point. For the end user, what I said means that every game will be cross-gen now, not only forward but also backward compatible, BUT there will be games that are exclusive to the PS5, much like you have Android games exclusive to much later versions of Android. There is no custom hardware anymore, which means there is no reason for "generations" in the first place.
Console generations existed because old hardware was fundamentally incapable of playing new games. The difference was just too vast. Right now that is not the case. Today's best-looking game will still be a good-looking game on the PS5.
It's not so much the difference in architecture; it's about the new experiences the new base spec enables, which just wouldn't be possible, or would be severely limited, if developed around a lower spec. This gen, the market dictated that the previous consoles were no longer needed after the first year.

Secondly, your "base model" comment does not hold up when the PC market exists and developers have shown that they can calculate for themselves what to target.
The PC AAA market is dictated by consoles; that's my point. They have consoles as their minimum target.
 
Why would anyone want 8K Blu-rays? UHD disc size hasn't increased; you would have 8K movies compressed to fit the same 100GB as 4K movies. Even if they add another layer, I don't see the extra 60GB being enough for the pixel increase between 4K and 8K. If they add another layer, then the price will obviously increase, but practically nobody has an 8K TV anyway. You would need an HVD disc, instead of Blu-ray, for proper-quality 8K movies, but HVD seems to be vaporware now.

Did you miss the whole point of me talking about AV1? AV1 will allow them to fit more into a smaller size, and there are also Blu-ray formats that can hold more than 100GB. Also, the "nobody has 8K" talk is silly, because if the PS5 is coming out in 2019-2020, it's going to be on the market through 2025.


CES: Blu-ray Disc Association Starts Licensing of 8k/4K Broadcast Recordable Blu-ray Format


Recording and archiving broadcast content is very popular in Japan and the Blu-ray Disc Association has started licensing the next-gen broadcast recordable Blu-ray format.

It is anticipated that Ultra HD broadcasting will start in Japan around the end of 2018. The new Blu-ray disc standard will allow Japanese consumers to record 4K/8K satellite broadcasts on BD-R and BD-RE discs.
8K satellite streams will be recorded and reproduced at up to 100Mbps, allowing consumers to maintain the original quality of 8K broadcasts, possibly on the recently released BD-RE XL media that support 4x playback. More than two hours of 8K content (100Mbps) can be recorded on BDXL discs (128GB quad-layer, 100GB triple-layer).
For recording 4K broadcasts, conventional BD-R/BD-RE media can be used.
The new standard supports the HEVC video codec up to a maximum of 8K/60p, as well as HDR (hybrid log-gamma method). Furthermore, the MPEG4-AAC and MPEG4-ALS audio codecs are also supported.

The new 4K/8K satellite TV recordings will be protected by the AACS2 system.
The BDA is working closely with industry parties to develop tools and processes required to ensure the interoperability between recorders, players and software.
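The article's "more than two hours" figure checks out with simple arithmetic (a sketch; decimal units, as optical-disc capacities are quoted):

```python
# How much 100 Mbps 8K broadcast footage fits on BDXL discs.
# Uses decimal units (1 GB = 1e9 bytes), as disc capacities are quoted.

bitrate_bps = 100e6                        # 100 Mbps broadcast stream
bytes_per_hour = bitrate_bps / 8 * 3600    # 45e9 bytes = 45 GB per hour

for capacity_gb in (100, 128):             # triple-layer and quad-layer BDXL
    hours = capacity_gb * 1e9 / bytes_per_hour
    print(f"{capacity_gb} GB disc: {hours:.2f} hours")
```

At 45 GB per hour, the 128GB quad-layer disc holds roughly 2.8 hours and even the 100GB triple-layer disc just over 2, consistent with the article's claim.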


 
Did you miss the whole point of me talking about AV1? AV1 will allow them to fit more into a smaller size, and there are also Blu-ray formats that can hold more than 100GB. Also, the "nobody has 8K" talk is silly, because if the PS5 is coming out in 2019-2020, it's going to be on the market through 2025.


CES: Blu-ray Disc Association Starts Licensing of 8k/4K Broadcast Recordable Blu-ray Format

So this AV1 will make an 8K movie fit on a 128GB disc with the same or better quality (regarding compression artifacts) as a 4K movie on a 100GB disc?
Do you have to sacrifice a chicken before watching the movie? Because considering 8K is 4x more pixels than 4K, that sounds almost like dark magic.
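For reference, the pixel arithmetic behind that skepticism (the scaling comment is a naive illustration; real codec bitrates do not scale linearly with pixel count, but the gap is still large):

```python
# 8K (7680x4320) has exactly four times the pixels of 4K (3840x2160).
pixels_4k = 3840 * 2160          # 8,294,400
pixels_8k = 7680 * 4320          # 33,177,600
print(pixels_8k / pixels_4k)     # 4.0

# Naive scaling: matching the quality of a 4K movie on a 100 GB disc
# would take ~400 GB at 8K, so fitting it on a 128 GB disc implies a
# codec roughly 3x as efficient under this crude linear assumption.
print(100 * pixels_8k / pixels_4k)   # 400.0
```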
 
There is no "TFLOP advantage" for Nvidia cards in a console environment. If you build your rendering pipeline in a way that it can tap all available 11 TFLOPs, 11 TFLOPs is 11 TFLOPs. So VEGA 56 performance would be much more like GTX 1080 Ti performance in a console environment. There are other bottlenecks that Nvidia solves better than AMD (like color compression), but that has nothing to do with computational power...

Consoles eliminate some of the difference in AMD vs. Nvidia driver and API overheads... What they don't do is eliminate hardware design choices.

Flops are a really simple paper calculation: shader ALUs * clock speed * how many operations per clock you can do (2).
If your design has a lot of smaller cores that each do less, but you can pack more of them in, you get more flops on paper.
Now, if you designed something with the same end performance that instead had fatter cores that on average get closer to that 2 ops/clock, then, assuming the same clock for simplicity, you would have fewer flops on paper despite the same performance.

To say consoles eliminate this hardware design difference shows a misunderstanding of what's going on. Neither approach is necessarily better or worse, but the difference in ratios is fundamentally baked into the hardware, even if a console lets you utilize a given (AMD) chip better. You'd also get higher utilization out of an Nvidia GPU had they used one instead.
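The paper calculation above is easy to make concrete (a sketch; shader counts and boost clocks are the commonly quoted figures for these cards, and real boost behavior varies by board):

```python
# Theoretical peak TFLOPs = shader ALUs * clock (GHz) * 2 ops/clock (FMA).
def peak_tflops(shader_alus, clock_ghz):
    return shader_alus * clock_ghz * 2 / 1000

# Commonly quoted specs (boost clocks, rounded):
rx_580   = peak_tflops(2304, 1.34)   # ~6.2 TFLOPs on paper
gtx_1060 = peak_tflops(1280, 1.71)   # ~4.4 TFLOPs on paper

# Similar real-world game performance despite ~40% more paper flops on
# the RX 580: paper flops say nothing about achieved ops per clock.
print(rx_580, gtx_1060)
```

This is exactly the thread's RX 580 vs. GTX 1060 example: two cards that trade blows in games while differing sharply in the paper number.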
 
So this AV1 will make an 8K movie fit on a 128GB disc with the same or better quality (regarding compression artifacts) as a 4K movie on a 100GB disc?
Do you have to sacrifice a chicken before watching the movie? Because considering 8K is 4x more pixels than 4K, that sounds almost like dark magic.

AV1 isn't the only upcoming codec; H.266 is also coming, but its hardware isn't going to be out until 2021. I only mentioned AV1 because its hardware should be out in time for the PS5, and there will be Blu-ray discs that can hold more than 128GB.
 
AV1 isn't the only upcoming codec; H.266 is also coming, but its hardware isn't going to be out until 2021. I only mentioned AV1 because its hardware should be out in time for the PS5, and there will be Blu-ray discs that can hold more than 128GB.
But new and expensive drives will be required to read the next-gen Blu-ray discs, because they aren't just stacking layers. Current BDXL drives can only read up to 4 layers (128GB). The max-capacity disc that can be read by a normal Blu-ray laser is 300GB, but it is dual-sided and triple-layered: not practical at all, and very expensive to use for movies. Do you expect next gen to have those new drives, plus all the other components, for $399?
 
Consoles eliminate some of the difference in AMD vs. Nvidia driver and API overheads... What they don't do is eliminate hardware design choices.

Flops are a really simple paper calculation: shader ALUs * clock speed * how many operations per clock you can do (2).
If your design has a lot of smaller cores that each do less, but you can pack more of them in, you get more flops on paper.
Now, if you designed something with the same end performance that instead had fatter cores that on average get closer to that 2 ops/clock, then, assuming the same clock for simplicity, you would have fewer flops on paper despite the same performance.

To say consoles eliminate this hardware design difference shows a misunderstanding of what's going on. Neither approach is necessarily better or worse, but the difference in ratios is fundamentally baked into the hardware, even if a console lets you utilize a given (AMD) chip better. You'd also get higher utilization out of an Nvidia GPU had they used one instead.

So you guys are saying that the Xbox One X with 6.2 TFLOPs (RX 580 class) could reach the performance of a 6.5 TFLOP GTX 1070, even though the same RX 580 on a PC can't match a GTX 1070???
I didn't know that... I didn't know that console APU architectures are able to solve so many of the problems and overheads that the same GPU has when running in a PC.

So a PS5 with 10.5 TFLOPs could be faster than a PC with a VEGA 56 (10.5 TFLOPs)? Something close to a GTX 1080?
 
So you guys are saying that the Xbox One X with 6.2 TFLOPs (RX 580 class) could reach the performance of a 6.5 TFLOP GTX 1070, even though the same RX 580 on a PC can't match a GTX 1070???
I didn't know that... I didn't know that console APU architectures are able to solve so many of the problems and overheads that the same GPU has when running in a PC.

So a PS5 with 10.5 TFLOPs could be faster than a PC with a VEGA 56 (10.5 TFLOPs)? Something close to a GTX 1080?

Look at what the PS4 is doing with a mere 1.84 TFLOPs in games like God of War, UC4, Spider-Man, Horizon, Detroit, and soon GoT and TLoU2.

They are barely distinguishable from the Pro versions. The assets are usually pretty much the same, other than resolution.
 
So you guys are saying that the Xbox One X with 6.2 TFLOPs (RX 580 class) could reach the performance of a 6.5 TFLOP GTX 1070, even though the same RX 580 on a PC can't match a GTX 1070???
I didn't know that... I didn't know that console APU architectures are able to solve so many of the problems and overheads that the same GPU has when running in a PC.

So a PS5 with 10.5 TFLOPs could be faster than a PC with a VEGA 56 (10.5 TFLOPs)? Something close to a GTX 1080?
(Much) less overhead. Locked hardware means much finer-tuned optimization and APIs written specifically for that hardware, including low-level APIs that PCs don't have (the closest would be Vulkan?). At least the PS4 has GNM for low level and GNMX for high level; I'm not sure how the Xbox APIs work.
You don't get more than the max theoretical flops out of console hardware; you get closer to full use of its capacity compared to PCs.
 
But new and expensive drives will be required to read the next-gen Blu-ray discs, because they aren't just stacking layers. Current BDXL drives can only read up to 4 layers (128GB). The max-capacity disc that can be read by a normal Blu-ray laser is 300GB, but it is dual-sided and triple-layered: not practical at all, and very expensive to use for movies. Do you expect next gen to have those new drives, plus all the other components, for $399?

If these Sony cartridges with 12 discs in them can hold 3.3TB, that means they already have discs that can hold 275GB (11 layers of 25GB, I'm guessing).
[Image: sony_6.jpg, Sony 12-disc cartridge]
The 1st gen held up to 1.5TB in 2013, the 2nd gen holds 3.3TB as of 2016, and the 3rd gen will be 6TB. There isn't a release date yet, but I would guess 2019 or 2020, and they should have 500GB per disc.
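The per-disc figures in that post follow directly from dividing cartridge capacity by the 12 discs (simple arithmetic, assuming the quoted cartridge capacities are accurate):

```python
# Per-disc capacity for a 12-disc cartridge, per the quoted generations.
DISCS_PER_CARTRIDGE = 12

for gen, cartridge_tb in ((1, 1.5), (2, 3.3), (3, 6.0)):
    per_disc_gb = cartridge_tb * 1000 / DISCS_PER_CARTRIDGE
    print(f"Gen {gen}: {per_disc_gb:.0f} GB per disc")
```

That yields 125, 275 and 500 GB per disc for the three generations, matching the post's numbers.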
 
So you guys are saying that the Xbox One X with 6.2 TFLOPs (RX 580 class) could reach the performance of a 6.5 TFLOP GTX 1070, even though the same RX 580 on a PC can't match a GTX 1070???
I didn't know that... I didn't know that console APU architectures are able to solve so many of the problems and overheads that the same GPU has when running in a PC.

So a PS5 with 10.5 TFLOPs could be faster than a PC with a VEGA 56 (10.5 TFLOPs)? Something close to a GTX 1080?


That's the opposite of what I said :p
Read it again?

Distilled form: two different issues. Consoles help with differences in driver overhead and also bring up hardware utilization rates, but they don't change the fundamental design decisions that lead to different numbers of flops for the same performance. If you have smaller cores each doing less, but in greater numbers, at the same clock speed, you have higher flops on paper. If you have the same performance, but larger cores that each do more and you use fewer of, you have fewer flops on paper for the same performance. Consoles do not change fundamental hardware design decisions.

So increased utilization, yes. But not:

If you build your rendering pipeline in a way that it can tap all available 11 TFLOPs, 11 TFLOPs is 11 TFLOPs. So VEGA 56 performance would be much more like GTX 1080 Ti performance in a console environment.

It does not get around built-in hardware design ratios.
 
That's the opposite of what I said :p
Read it again?

OK... But I can believe that consoles reach better performance than a PC with the same specs because of the optimizations...
I've heard that an RX 580 with 6.2 TFLOPs can't do the same things the Xbox One X can, despite the hypothetically identical specs... I don't remember which post or thread said something about it...

But anyway... Do you think it's worth buying an Xbox One X rather than an RX 580? Maybe your answer will clear my mind...
 
Look at what the PS4 is doing with a mere 1.84 TFLOPs in games like God of War, UC4, Spider-Man, Horizon, Detroit, and soon GoT and TLoU2.

They are barely distinguishable from the Pro versions. The assets are usually pretty much the same, other than resolution.

That's because the PS4 and PS4 Pro both have the same amount of RAM. The PS4 Pro couldn't add 4K texture assets even if developers wanted to, due to the 8GB RAM limitation and slower memory speed. Digital Foundry talked about this when they were comparing the Xbox One X to the PS4 Pro last year. The 12GB of RAM in the Xbox One X allows developers to add higher-quality textures to X-enhanced games.
 
OK... But I can believe that consoles reach better performance than a PC with the same specs because of the optimizations...
I've heard that an RX 580 with 6.2 TFLOPs can't do the same things the Xbox One X can, despite the hypothetically identical specs... I don't remember which post or thread said something about it...

But anyway... Do you think it's worth buying an Xbox One X rather than an RX 580? Maybe your answer will clear my mind...

They absolutely get used better; I just disagreed with the other guy saying consoles nuke all variables such that flops are flops all around. That's not right.

I guess I could have explained it this way: say it's a CPU core instead. You have a 6-wide core and a 3-wide core, and everything else is the same. Now, if you built your program entirely around the 3-wide core, you'd use it better, but you'd never completely overcome the fact that the other core is wider and can do more each cycle, even at the same clock speed. Kinda what I'm saying; no analogy is perfect.


As for what to buy: it's all about what you like playing, really, I dunno!
What I tend to like doing is building a PC one tier after a new-generation console has launched, like the 8800 that lasted for-freaking-ever since it launched the same month as the PS3 but was vastly better. That could be a lot more waiting than you want to do, though, so, I dunno, everyone's purchases are their own!
 
That's because the PS4 and PS4 Pro both have the same amount of RAM. The PS4 Pro couldn't add 4K texture assets even if developers wanted to, due to the 8GB RAM limitation and slower memory speed. Digital Foundry talked about this when they were comparing the Xbox One X to the PS4 Pro last year. The 12GB of RAM in the Xbox One X allows developers to add higher-quality textures to X-enhanced games.

I know the explanation. Still... 1.84TF and the fidelity they are putting out is astonishing to say the least.
 
They absolutely get used better, I just disagreed with the other guy saying consoles nuke all variables such that flops are flops all around, that's not right.

I guess I could have explained it this way: Say it's a CPU core instead. You have a 6-wide core, and a 3-wide core, and everything else is the same. Now, if you built your program entirely around the 3-wide core, you'd use it better, but you'd never completely beat the variable that the other core is wider and can do more each cycle, even at the same clock speed. Kinda what I'm saying, no analogy being perfect.


As for what to buy - all about what you like playing really, I dunno!
What I tend to like doing is building a PC one tier after a new generation console has launched, like the 8800 that lasted for-freaking-ever since it launched the same month as the PS3 but was vastly better. But that could be a lot more waiting than you want to do, so, I dunno, everyone's purchases are their own!

My favorite franchise is Resident Evil... My avatar demonstrates this clearly (actress for the Jill Valentine model). But I don't like PCs... I prefer all these optimizations being done by developers, and just putting the disc inside the box to start my game. I love consoles... But I also want to have the best experience possible in terms of graphics, services and controller when I play the Resident Evil 2 Remake next year. So I decided to sell my PS4 PRO to buy an Xbox One X. I don't really care about Sony, Nintendo and Xbox exclusives... My favorite franchises are third party... DOOM 2016 was one of the best current FPS games and I'm anxious for DOOM ETERNAL. So I'll take the console that brings me something near a PC with a GTX 1070 and runs the best native 4K versions of my favorite games... I can wait until 2019/2020 to play TLOU2, Ghost of Tsushima or Death Stranding when a PS4 PRO SLIM or even a PS5 comes out.
 
My favorite franchise is Resident Evil... My avatar demonstrates this clearly (actress for the Jill Valentine model). But I don't like PCs... I prefer all these optimizations being done by developers, and just putting the disc inside the box to start my game. I love consoles... But I also want to have the best experience possible in terms of graphics, services and controller when I play the Resident Evil 2 Remake next year. So I decided to sell my PS4 PRO to buy an Xbox One X. I don't really care about Sony, Nintendo and Xbox exclusives... My favorite franchises are third party... DOOM 2016 was one of the best current FPS games and I'm anxious for DOOM ETERNAL. So I'll take the console that brings me something near a PC with a GTX 1070 and runs the best native 4K versions of my favorite games... I can wait until 2019/2020 to play TLOU2, Ghost of Tsushima or Death Stranding when a PS4 PRO SLIM or even a PS5 comes out.


Sounds like you know what to buy then, the X is great for cross platform until next gen.
 
Consoles eliminate some of the AMD vs Nvidia driver and API overhead differences... What they don't do is eliminate hardware design choices.

Flops are a really simple paper calculation: shader ALUs * clock speed * operations per clock (2, counting a fused multiply-add as two).
If your design has a lot of smaller cores that each do less, but you can pack more in, you get more flops on paper.
Now if you designed something with the same end performance that instead had fatter cores that on average get closer to those 2 ops/clock, and for simplicity assuming the same clock, you have fewer flops on paper, despite the same performance.
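That paper calculation can be checked against the cards discussed in this thread. A quick sketch; the ALU counts and boost clocks below are public spec-sheet figures, so treat the results as approximate:

```python
# Paper TFLOPs = shader ALUs x clock (GHz) x 2 ops/clock (FMA), / 1000.
# Spec-sheet numbers, not measured performance.

def paper_tflops(alus, clock_ghz, ops_per_clock=2):
    return alus * clock_ghz * ops_per_clock / 1000

gpus = {
    "PS4 Pro":    (2304, 0.911),
    "Xbox One X": (2560, 1.172),
    "RX 580":     (2304, 1.340),  # boost clock
    "Vega 56":    (3584, 1.471),  # boost clock
}

for name, (alus, clock) in gpus.items():
    print(f"{name}: {paper_tflops(alus, clock):.1f} TFLOPs")
```

This reproduces the 4.2 / 6.0 / 6.2 / 10.5 figures quoted around the thread, and shows why the number says nothing about how close real workloads get to 2 ops/clock.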

To say consoles eliminate this hardware design difference shows a misunderstanding of what's going on. Neither approach is necessarily better or worse, but the difference in ratios is fundamentally baked into the hardware, even if a console lets you utilize a given (AMD) chip better. You'd also get higher utilization out of an Nvidia GPU had they used one instead.

The PS4's design had a focus on increased compute performance, iirc. This was probably chosen under the guidance of some of the top developers in the business.
 
Did you miss the whole point of me talking about AV1? AV1 will allow them to fit more into a smaller size, and there are also Blu-ray formats that can hold more than 100GB. Also, the "no one has 8K" talk is silly, because if the PS5 is coming out in 2019-2020, it's going to be on the market through 2025.
honestly 8K feels way too premature; by 2025, 4K will just be starting to become prevalent in people's homes
Unless the switch away from silicon happens in the next 10 years, I hope Sony and MS don't do dumbass shit like pushing 8K gaming
That's because PS4 and PS4 Pro both have the same amount of RAM. PS4 Pro can't add 4K texture assets even if they wanted to due to the 8GB RAM limitation and slower memory speed. Digital Foundry talked about this when they were comparing the Xbox One X to the PS4 Pro last year. The 12GB of RAM on the Xbox One X allows developers to add higher-quality textures on X-enhanced games.
Right this is why i think 16gb is wholly inadequate for a proper next gen jump
I can wait until 2019/2020 to play TLOU2, Ghost of Tsushima or Death Stranding when a PS4 PRO SLIM or even a PS5 comes out.
I say wait for Black Friday to get a good deal on the X. Hopefully the PS5 is backwards compatible and you can just play those games there.
 
128 GB ram
no HDD or SSD ... everything runs in a RAM DISK

just kidding
...

I had a computer where I did SSD caching... I think that's the way to go at this point: HDD for storage, and maybe a 32 or 64 GB SSD for caching... I got the same performance as an OS installed directly to an SSD... so like a Fusion Drive.
Maybe that could alleviate load times and terrible slow-HDD bottlenecks
 
HDDs with SSD caching already exist and I use one in my PS4 Pro; they're called SSHDs. Still likely cost prohibitive for a console.
 
So you guys are saying that the Xbox One X with 6.2 Tflops (RX 580) could reach the same performance as a GTX 1070 with 6.5 Tflops, even though the same RX 580 on PC can't match GTX 1070 performance???
I didn't know that... I didn't know that console APU architectures were able to solve so many problems and incompatibilities compared to the same GPU running on a PC.

So a PS5 with 10.5 Tflops could be faster than a PC with a Vega 56 (10.5 Tflops)? Something close to the GTX 1080?
The Xbox X GPU and the RX 580 are not the same GPU. The Xbox X GPU is custom made; unlike the RX 580 it has a few Vega features (the same memory bandwidth compression method), and it has DX12 built in at the hardware level, so the GPU and CPU work more efficiently. The shader compiler is also more efficient on Xbox X. To sum it up, the Xbox X GPU is faster than the RX 580; these are not the same cards anymore. The PS4 Pro has a 4.2 Tflops Polaris GPU, yet the Xbox X with just 1.8 Tflops more can render up to 2.2x as many pixels, which shows how much faster the Xbox X architecture is compared to the PS4 Pro.
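A quick sanity check on that "2.2x the pixels from ~1.4x the Tflops" gap, comparing the flops ratio to the pixel ratio of common PS4 Pro vs Xbox One X render targets (1440p vs native 4K; illustrative arithmetic, not a benchmark):

```python
# Flops ratio vs pixel ratio between PS4 Pro and Xbox One X targets.
# 1440p and native 4K are typical render targets; real games vary.

xbox_tflops, pro_tflops = 6.0, 4.2
flops_ratio = xbox_tflops / pro_tflops   # ~1.43x the paper flops

pixels_4k    = 3840 * 2160               # 8,294,400
pixels_1440p = 2560 * 1440               # 3,686,400
pixel_ratio  = pixels_4k / pixels_1440p  # 2.25x the pixels

print(f"flops ratio: {flops_ratio:.2f}x, pixel ratio: {pixel_ratio:.2f}x")
```

So pushing 2.25x the pixels from 1.43x the flops is what the "more efficient architecture" claim amounts to, assuming both machines were pixel-bound to begin with.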
 
The Xbox X GPU and the RX 580 are not the same GPU. The Xbox X GPU is custom made; unlike the RX 580 it has a few Vega features (the same memory bandwidth compression method), and it has DX12 built in at the hardware level, so the GPU and CPU work more efficiently. The shader compiler is also more efficient on Xbox X. To sum it up, the Xbox X GPU is faster than the RX 580; these are not the same cards anymore. The PS4 Pro has a 4.2 Tflops Polaris GPU, yet the Xbox X with just 1.8 Tflops more can render up to 2.2x as many pixels, which shows how much faster the Xbox X architecture is compared to the PS4 Pro.

The Pro GPU is also custom with some Vega features as well.

https://www.eurogamer.net/articles/...tation-4-pro-how-sony-made-a-4k-games-machine

The memory bandwidth is where the X excels mainly.
 
EXACTLY! :messenger_sunglasses:

But I still think the PS5 will run 4K checkerboard. Because if this console comes out with GTX 1080-level performance, it will never be able to deliver a big generational jump in terms of graphics running native 4K. Unless this console reaches GTX 1080 Ti performance :messenger_fearful:

The true native 4K console will be the Xbox Scarlet that comes out after the PS5. Maybe a PS5 PRO and an Xbox Scarlet X will be able to run native 4K at 60fps.

But most games will be 30 fps anyway, which is fine. PS5 will be a 4K console.

Yeah, I too think we'll see lots of checkerboarded titles to get the most out of the GPU, even if we got the equivalent of GTX 1080 Ti performance🤤🤤🤤

And hopefully many devs figure out how to use Insomniac's temporal injection, because the newest Spider-Man game looks super sharp and crisp. I think someone said that game is like 1600p, right?
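The pixel math behind that preference is easy to sketch. These are back-of-the-envelope shaded-pixel counts, not measured costs, and the 2844x1600 figure is just a 16:9 guess at the "~1600p" mentioned above:

```python
# Shaded pixels per frame for a few rendering strategies.
# Checkerboard shades roughly half the pixels of the target resolution.

def pixels(w, h):
    return w * h

native_4k = pixels(3840, 2160)   # 8,294,400
cb_4k     = native_4k // 2       # ~4,147,200 shaded, reconstructed to 4K
p1800     = pixels(3200, 1800)   # 5,760,000
p1600     = pixels(2844, 1600)   # hypothetical 16:9 "~1600p" frame

for name, n in [("native 4K", native_4k), ("4K checkerboard", cb_4k),
                ("1800p", p1800), ("~1600p", p1600)]:
    print(f"{name}: {n:,} pixels ({n / native_4k:.0%} of native 4K)")
```

Checkerboarding cuts the shading work to about half of native 4K, which is the whole "nearly half the rendering load" argument from earlier in the thread.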
 
HDDs with SSD caching already exist and I use one in my PS4 Pro; they're called SSHDs. Still likely cost prohibitive for a console.


The 32GB NAND cache ones are just around 50 bucks for 1TB, all in one.

https://www.amazon.com/dp/B01C4W2AK4/?tag=neogaf0e-20

It would add some, I dunno, dozen dollars to the BoM over a standard drive, but it would seem very worth it.


If next gen has 24, maybe more, gigs of RAM, we'll really want it filled fast. It's why the 8th gen made hard drive installation (nearly?) universally mandatory: RAM and disks got larger and took longer to fill.

Another 2.5 times that memory, and we'll be wanting a solution that can fill it faster. 7200rpm drive with NAND cache would be one way (I have a 7200rpm SSHD drive in my PS4 right now, noise heat and vibration are not issues).




That said, I suppose they could always just pass that cost to users who do want to upgrade and leave the drive upgradable, using a standard drive for everyone else to use that little BoM increase elsewhere. But the standard user experience may be slower than it is now then.
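Rough fill-time math behind that argument. The 24 GB pool and the per-drive throughputs are ballpark assumptions, not measurements:

```python
# Time to stream a full RAM pool's worth of assets from storage at a
# sustained sequential rate. All figures are rough assumptions.

def fill_seconds(ram_gb, mb_per_s):
    return ram_gb * 1024 / mb_per_s

ram_gb = 24  # hypothetical next-gen RAM pool

drives = {
    "5400rpm HDD":              100,  # MB/s, ballpark sequential
    "7200rpm SSHD (cache hit)": 500,  # ballpark for the NAND cache
    "SATA SSD":                 550,
}

for name, mbps in drives.items():
    print(f"{name}: ~{fill_seconds(ram_gb, mbps):.0f} s to fill {ram_gb} GB")
```

So roughly four minutes from a bare HDD versus under a minute when the NAND cache or an SSD is serving the data, which is why a cheap cache in front of a big disk looks attractive for the BoM.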
 
I'm guessing he meant "Also"

Ok lol :) .. I always thought it was interesting that Sony and MS put in customisations .. how can they know AMD GPUs better than AMD? I'm guessing they're features no one can easily use on PC but console devs actually will.
 
The Pro GPU is all custom with some Vega features as well.

https://www.eurogamer.net/articles/...tation-4-pro-how-sony-made-a-4k-games-machine

The memory bandwidth is where the X excels mainly.
The PS4 Pro has FP64 support that's useless in normal games, while the Xbox X has a very efficient memory bandwidth compression method that works in every game (and is especially useful at higher resolutions), and on top of that it has DX12 built into the chip. You don't have anything like that on the PS4 Pro. All you have to do is look at Xbox X game results compared to the PS4 Pro: NIGHT and DAY difference. The difference is bigger than the TFLOPS numbers alone suggest, because it's a much more efficient architecture. The Xbox X GPU is really a beast even compared to PC; for example, Shadow of the Tomb Raider runs at dynamic 4K on Xbox X (dips to 1800p), and that's GTX 1070 territory. GTX 1070 results in Tomb Raider: 23 fps at 4K (so 1800p should be around 30fps, like on Xbox X).
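The "23 fps at 4K so ~30 fps at 1800p" step assumes frame time scales with pixel count. A sketch of that estimate; real games are only partly pixel-bound, so treat it as an optimistic upper bound:

```python
# Naive fps estimate: assume the frame is entirely pixel-bound, so fps
# scales inversely with rendered pixel count.

def scaled_fps(fps, from_res, to_res):
    return fps * (from_res[0] * from_res[1]) / (to_res[0] * to_res[1])

fps_4k = 23  # GTX 1070, Shadow of the Tomb Raider benchmark cited above
est_1800p = scaled_fps(fps_4k, (3840, 2160), (3200, 1800))
print(f"estimated 1800p fps: {est_1800p:.0f}")
```

The naive scaling gives about 33 fps at 1800p, consistent with the "around 30 fps" claim but only under that pixel-bound assumption.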
 
The Xbox X GPU and the RX 580 are not the same GPU. The Xbox X GPU is custom made; unlike the RX 580 it has a few Vega features (the same memory bandwidth compression method), and it has DX12 built in at the hardware level, so the GPU and CPU work more efficiently. The shader compiler is also more efficient on Xbox X. To sum it up, the Xbox X GPU is faster than the RX 580; these are not the same cards anymore. The PS4 Pro has a 4.2 Tflops Polaris GPU, yet the Xbox X with just 1.8 Tflops more can render up to 2.2x as many pixels, which shows how much faster the Xbox X architecture is compared to the PS4 Pro.

Some users said that consoles eliminate AMD driver problems... and that 1.8, 4.2 or 6.2 Tflops are fully exploited in the console environment. Do you agree with that? Why doesn't the RX 580 with 6.2 Tflops reach something near the GTX 1070's 6.5 Tflops? The RX 580 is much more like a GTX 1060 with 4.2 Tflops :messenger_neutral:

If you want to reach the same performance as a GTX 1070 with 6.5 Tflops on your PC, you'll need to buy a Vega 56, which has 10.5... I know Teraflops don't tell the whole story when we're talking about full GPU specs. But if these users understand hardware and are telling the truth, the PS5 will have the same performance as a GTX 1080 Ti even using an AMD GPU like a Vega 56 or similar, its 10.5 Tflops bringing amazing performance near the GTX 1080 Ti's 10.8 Tflops.

Could you correct me if I'm wrong? Is the Xbox One X near GTX 1070 performance, or is it much more like a GTX 1060, as happens on PC?
 
The PS4 Pro has FP64 support that's useless in normal games, while the Xbox X has a very efficient memory bandwidth compression method that works in every game (and is especially useful at higher resolutions), and on top of that it has DX12 built into the chip. You don't have anything like that on the PS4 Pro. All you have to do is look at Xbox X game results compared to the PS4 Pro: NIGHT and DAY difference. The difference is bigger than the TFLOPS numbers alone suggest, because it's a much more efficient architecture. The Xbox X GPU is really a beast even compared to PC; for example, Shadow of the Tomb Raider runs at dynamic 4K on Xbox X (dips to 1800p), and that's GTX 1070 territory. GTX 1070 results in Tomb Raider: 23 fps at 4K (so 1800p should be around 30fps, like on Xbox X).

This post isn't actually true.
 
Some users said that consoles eliminate AMD driver problems... and that 1.8, 4.2 or 6.2 Tflops are fully exploited in the console environment. Do you agree with that? Why doesn't the RX 580 with 6.2 Tflops reach something near the GTX 1070's 6.5 Tflops? The RX 580 is much more like a GTX 1060 with 4.2 Tflops :messenger_neutral:

If you want to reach the same performance as a GTX 1070 with 6.5 Tflops on your PC, you'll need to buy a Vega 56, which has 10.5... I know Teraflops don't tell the whole story when we're talking about full GPU specs. But if these users understand hardware and are telling the truth, the PS5 will have the same performance as a GTX 1080 Ti even using an AMD GPU like a Vega 56 or similar, its 10.5 Tflops bringing amazing performance near the GTX 1080 Ti's 10.8 Tflops.

Could you correct me if I'm wrong? Is the Xbox One X near GTX 1070 performance, or is it much more like a GTX 1060, as happens on PC?
The Xbox X is faster than the 1060 and RX 580 for a fact, and by now you can see it across many games.

A GTX 1060 at +2GHz (so results are ~10fps better than the stock model): 45-55fps at dynamic 4K, medium/low settings, and 35-45fps at native 4K. Now compare that to the Xbox X results: 55-60fps at dynamic 4K and higher settings, and 45fps+ at native 4K. So even with the OC and worse graphics details, the GTX 1060 is still around 10 fps slower.

So basically, MS has improved the AMD GPU a LOT, to the point that its 6 Tflops are much more efficient than the same 6 Tflops on any other AMD card on PC.
 
This post isn't actually true.
You can read the PC benchmarks for Tomb Raider yourself:
http://www.pcgameshardware.de/Shado...Shadow-of-the-Tomb-Raider-Benchmarks-1264575/
GTX 1070 at 4K: 23fps

And here you can read about the Xbox X GPU, for example about delta color compression:
https://gamingbolt.com/behind-the-xbox-one-xs-architecture-part-i-what-makes-the-gpu-special
https://gpucuriosity.wordpress.com/...der-cache-size-advantage-over-the-older-gcns/

https://www.neogaf.com/threads/xbox-scorpio-dx12-built-directly-into-gpu.1358475/#post-233477389
"The bottom line is that Scorpio's six teraflops will almost certainly go a lot further than an equivalent PC part. I asked Microsoft about this specifically, and they raise a number of good arguments that make the case strongly."

That Digital Foundry quote says it all. The Xbox X GPU is basically an RX 580 on steroids, so it should be no surprise that it's faster than the RX 580.
 
The 32GB NAND cache ones are just around 50 bucks for 1TB, all in one.

https://www.amazon.com/dp/B01C4W2AK4/?tag=neogaf0e-20

It would add some, I dunno, dozen dollars to the BoM over a standard drive, but it would seem very worth it.


If next gen has 24, maybe more, gigs of RAM, we'll really want it filled fast. It's why the 8th gen made hard drive installation (nearly?) universally mandatory: RAM and disks got larger and took longer to fill.

Another 2.5 times that memory, and we'll be wanting a solution that can fill it faster. 7200rpm drive with NAND cache would be one way (I have a 7200rpm SSHD drive in my PS4 right now, noise heat and vibration are not issues).




That said, I suppose they could always just pass that cost to users who do want to upgrade and leave the drive upgradable, using a standard drive for everyone else to use that little BoM increase elsewhere. But the standard user experience may be slower than it is now then.
I'm wondering if they would just build the NAND into the system design itself... that way you could just buy and use any HDD for an upgrade. I think that would let them customize the caching for games at a system level... maybe... who knows.
 