sonomamashine
Member

Honestly, it's going both ways. As soon as the 12TF rumor for Xbox came out, "insiders" suddenly started to question whether it was RDNA, even though it was said a million times that it was Navi. Then the rumor of a 9TF PS5 gained some traction from somewhat reputable sources, and suddenly it was 13TF. Now both consoles are roughly the same, with the Xbox having a slight edge. So sure, MS may have some shills, but Sony does also. It just depends on which side of the fence you are rooting for. I'm rooting for both. If both console makers make badass hardware, my PC will actually be forced to use its muscle.
Depends on how many teams are working on it. I don't think Microsoft would risk not having next-gen graphics on their 12TF machine. I mean, look at the anniversary editions of Halo. You can turn the old and new graphics engines on and off and there are very clear noticeable differences. Ray tracing will definitely help with that as well.
Every day that passes, it looks more like SeX at 12TF and PS5 at 9.2TF.
Sony being too chicken to tell gamers about PS5 doesn't help.
Last gen, Sony was already itching for the PS4 reveal and ready to PR the 8GB of GDDR5 RAM.
Sony doesn't seem too excited this time.
Perfect comment, I agree with you 100%.
You can see YouTubers like Colteastwood and Dealer Gaming making videos about how powerful the Xbox Series X will be over the PS5, EVEN when there are verified reports and reputable people like Colin Moriarty and Andrew Reiner stating the PS5 is more powerful. It's really embarrassing and disgusting to see how Xbox fanboys lie and misinform people into thinking this way when reality isn't like that. I just wanna see their reactions when the PS5 really is as powerful as the XSX or more, because I can guarantee you that this will happen; there are too many reports and too much evidence behind it.
I remember someone named Klobrille IMMEDIATELY chimed in saying the XSX is "clearly more advanced and more powerful than PS5" after that first WIRED article about the PS5. Those Klobrille statements turned out to be fake; it was just what Microsoft was targeting, not what Sony had and what Microsoft would have. And Colteastwood has been making videos for over a year talking about how the XSX will be more powerful when, in fact, all reports are saying otherwise.
It'll be sooo funny when the final specifications come out and the PS5 may end up having the more powerful parts. Oh man, the reactions.
Why would the boss of their studios have an impact? He's just a figurehead. Doesn't impact hardware development at all and should not impact software development either. It's not like a company stops functioning if they get a new CEO.
Mod of War: User camefromthenearfuture was banned because he refused the verification process and continued to pose as an "insider", even after several warnings. User xcloudtimdog does the same, yet you let him be. Partiality.
GitHub is owned by Microsoft, not Google.
He has a patent to disable CUs. It doesn't have to be half of the CUs; half makes it easier, of course, but it doesn't have to be half if you read the patent.
All evidence so far is that Sony handles BC by directly mimicking the old hardware as closely as possible. They did it with PS4 Pro, they have patents suggesting it, and then there is Oberon, which does the same. PS4 to PS4 Pro was 18 to 36 CUs, with the ability to disable half for BC.
One thing to consider is that they will also be planning ahead right now for how to do PS5 BC on either a PS5 Pro or PS6. So quite possibly it's the same method: 36 CUs to 72 CUs on PS5 Pro / PS6. It's a straightforward, tidy system of doubling the CUs that achieves their goals.
Agreed. It's silly seeing some folks label others fanboys because they take the GitHub leak into consideration. The GitHub leak does have issues, but I think it still has validity in discussion, since there would be no reason to test an old APU in the middle of 2019. Saying that it was tested just to see what limits it could be pushed to also doesn't make sense, since testing an old APU with fewer CUs than the apparent new one would be pointless. It could maybe be old results being posted that late, but testing a supposedly outdated APU that deep into 2019 makes zero sense. The other explanation is that shortly after the tests Sony went with a new GPU for the APU and made very late changes, which is the most believable of all the possibilities if the new 11-12 tflop PS5 specs are true.
Still doesn't make sense testing a supposedly old chip that late; surely they would have been testing a new one for back compat if it was ready. Which again could mean that the supposed new chip wasn't ready at the time of these tests. But then again, they would have to test the new chip for back compat again anyway. There are many unanswered questions regarding the GitHub leak.
You are missing one hypothesis: if the tests are only for back-compat mode (and assuming PS5 will use the same method as the Pro), Oberon only has 36 CUs active to 'mimic' the legacy hardware.
I don't have access to all the data, and I don't know if someone here does; that would be the only way to confirm it.
Because their final chip is not ready and they don't want to lose time when they could use the old chip from old dev kits to do BC regression tests? I mean, there are many reasons if they have more than one chip (if they have).
This is something the naysayers keep seeming to dismiss. If Oberon has no relevance, why would it still be getting tested for benchmarks? Why have even later steppings since been datamined?
That said, I don't think those tests are showing the full picture. Another poster, about two weeks ago, mentioned that PS4 Pro chip testing had a block of CUs disabled in order to mirror PS4 compatibility; however, there were of course more CUs on the chip than the tests actually showed. I think a similar thing might be the case with Oberon (and, for all we know, possibly Arden as well), to mirror the setup Sony used with PS4 Pro.
There was a graphic estimating chip sizes and CU counts (among other things) based on the data available, and that particular graphic mentioned a possible 48 CUs on Oberon. That could be possible; it'd be about 10.75TF @ 1750MHz. Assuming by some chance they could actually push it to 2000MHz, that would be 12.29TF, but that is well past Navi's sweetspot. So at 1800MHz, it'd give 11.058TF. I'm just going with 48 CUs as that's what the graphic posted on ResetEra showed (I know, Era's garbage, but the graphic itself was well done and logical), but you can see how that does give an 11-12TF range when clocking at Navi's sweetspot.
Now the only thing that isn't explained is why Oberon has been tested at 2GHz. Some of the later Oberon steppings seem to have made some bug fixes (such as to the memory, possibly expanding the bandwidth). It could be that the version in the GitHub leak had problems with turning on an extra set of CUs. So far the post-GitHub datamines still seem to list the chip at 40 max CUs, so with Oberon looking like a persistently tested chip this close to launch, rather than some magical other chip popping up, I'm hoping a datamined Oberon stepping with 48-52 CUs (or hell, possibly more than that) surfaces. Maybe closer to GDC; we'll see.
Right now I'm in the camp that Oberon is still very much the PS5's chip, but there is very likely a chunk of CUs disabled that we aren't seeing, which could be due to bugs in the silicon that should be fixed with some later stepping. And it's very likely the person who did the chip graphic estimate for PS5 and XSX (that was posted on Era) could be correct in estimating around 48 active CUs, lining up with the other poster who mentioned that PS4 Pro's GPU had a chunk of disabled CUs for testing PS4 compatibility. Given that Cerny headed both that project and now PS5, he'd very likely employ a similar design choice with Oberon.
All we need is a later Oberon stepping with more active CUs on it to confirm all of this. Taking Navi's sweetspot into consideration, an 11.058TF PS5 would be a hell of a beast, and would fit within the ~10% range some insiders have claimed. (For all we know, XSX could indeed be a 56CU chip but clocked really low ATM, such as @ 1450MHz, pegging it around 10.3936TF currently, and they're just waiting to up the clocks later. Of course, 1450MHz is well below Navi's sweetspot.) So in the end, with that looking very possible, XSX could have a slight TF edge, but it'd be less than 10% if they really are aiming just for 12TF; if the maximum difference of 10% is reached, that would give XSX about 12.3648TF @ 1725MHz.
So even in that case, the difference wouldn't be big, and we know both systems are using the same CPU designs (possibly the L3$ could be different between them, I guess), similar memory (GDDR6), and a custom SSD as a cache (with rumors of PS5's being possibly faster). But that looks like the most probable scenario for right now. I do know that PS4 Pro's GPU actually disabled half its chip for PS4 functionality; by that metric we could speculate Oberon potentially has 72 active CUs (or uses dual Oberons in a chiplet setup). However, the reason I've dismissed this option personally is that it would be overkill for the numbers insiders themselves have been pegging as any upper limit for next-gen systems, as the chip(s) would have to be severely underclocked below Navi's sweetspot to hit "just" 12TF or even "just" 13TF. A waste of silicon and BOM that doesn't effectively maximize the potential sounds very unlike Cerny.
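For reference, every teraflop figure in this post comes from the same napkin formula: TFLOPS = CUs × 64 shaders per CU × 2 FMA ops per clock × clock speed. A quick Python sketch to sanity-check the numbers; the 64-shaders-per-CU and 2-ops-per-clock figures are the usual GCN/RDNA assumptions, not confirmed specs for either console:

```python
def tflops(cus: int, clock_mhz: float) -> float:
    """Theoretical FP32 TFLOPS for a GCN/RDNA-style GPU:
    CUs x 64 shaders/CU x 2 ops/clock (FMA) x clock."""
    return cus * 64 * 2 * clock_mhz / 1_000_000

# The figures quoted above:
oberon_48_sweetspot = tflops(48, 1750)  # ~10.75 TF
oberon_48_pushed    = tflops(48, 2000)  # ~12.29 TF
xsx_56_lowclock     = tflops(56, 1450)  # ~10.39 TF
xsx_56_upclocked    = tflops(56, 1725)  # ~12.36 TF
dual_oberon_72      = tflops(72, 1800)  # ~16.59 TF
```

Plugging in 36 CUs at PS4 Pro's 911MHz gives ~4.2TF, matching the known console, which is a decent check that the formula itself is right.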
(following is just pure radical speculation btw...)
Unless you could, say, "upgrade" the performance of such a setup by buying an optional custom cooling kit to install in the PS5, to basically turn it into a 'PS5 Pro' without needing to purchase an actual PS5 Pro. That could bump the performance up to, say, 16.58TF if clocking the GPU(s) to 1800MHz. It's a super-wild possibility, and it would pretty much cut out any chance for Sony to get double-dip system buys, with people purchasing a PS5 early on and then a PS5 Pro a few years later. But if they've decided that the Pro model approach wasn't ultimately profitable compared to the costs, and they still want to provide a pathway for owners to upgrade the specs, they COULD do that. They COULD eat costs on PS5 with such a big chip/dual-chiplet GPU setup heavily underclocked with a "poorer" cooling solution built in to hit 10-11TF @ $399 (for example), but provide a $100-$150 cooling upgrade that would make it a PS5 Pro by giving the system a much better cooling solution and an upgraded PSU that's super easy to install.
Can't quite wrap my head around that type of idea as a business decision (plus it wouldn't answer other potential problems, like the "upgraded" GPU(s) potentially being memory- and bandwidth-starved due to the upclock).
That was obviously sarcasm directed at Tim, me pandering to his want/need for a gaming box to be inferior. I'm sure you're being purposely obtuse, as it was over the top and obvious.
Care to point me towards where you said anything about a new Tsushima trailer or that the devkit is the final box? Because the search function doesn't return any such results.
Also, for a person with no agenda, you sure took some swings at Tim when he said something that doesn't sound great about PS5. Not the reaction I would expect from an objective person.
I'm sorry what's this?
Which was most likely bullshit at that time too. Unless Sony wants a giant bulky console. Considering the Japanese market and how they don't like bulky electronics (part of the reason Xbox was shunned), I really don't think they're in a position to allow that. In order to bring the size down, they'll need to bring the power down. But RDNA should help with that; 9TF will be no slouch by any means, also considering what Sony did with 1.4TF.
The PS5 rumor of 13 TFLOPS comes from April.
The docs of the GitHub leak were uploaded to Google Drive, and AMD recently issued a DMCA to make Google take the data down. It proves the authenticity of the source, but little more beyond that.
I'm sorry, what's this?
Without the context of the tests, it could be anything.
Still doesn't make sense testing a supposedly old chip that late; surely they would have been testing a new one for back compat if it was ready. Which again could mean that the supposed new chip wasn't ready at the time of these tests. But then again, they would have to test the new chip for back compat again anyway. There are many unanswered questions regarding the GitHub leak.
In my opinion, the PS5 is a dual-GPU chiplet design, with the ray tracing hardware separate from the GPUs and connected to them in parallel.
Why dual GPU? To better manage the two video signals for PSVR2: the two GPUs can each render one of the VR2 views separately.
Though NeoGAF is an international forum, discussion is to be held in English unless otherwise designated.
Dude, you got to translate that shit if you want to be understood... or I will start to reply to you in Portuguese, percebido?
In my opinion, the PS5 is a dual-GPU chiplet design, with the ray tracing hardware separate from the GPUs and connected to them in parallel.
Why dual GPU? To better manage the two video signals for PSVR2: the two GPUs can each render one of the VR2 views separately.
They can issue a DMCA on false information too.
The docs of the GitHub leak were uploaded to Google Drive, and AMD recently issued a DMCA to make Google take the data down. It proves the authenticity of the source, but little more beyond that.
So Xbox has two chips, for Lockhart and SeX.
And why would Sony only look at one chip and corner themselves?
If they had any sense, they would work on both a low-end and a more expensive offering and see what transpires with yields and costs. Options are good; no options are not.
First of all, this:
This is something the naysayers keep seeming to dismiss. If Oberon has no relevance, why would it still be getting tested for benchmarks? Why have even later steppings since been datamined?
Plus, if Oberon is related to the PS5, it also means they need to push it to 2GHz to emulate what 48 CUs can deliver in graphics perf @ 1700MHz; that's what explains the V dev kit (2GHz is a lot). The final product WILL NOT run @ 2GHz.
Because their final chip is not ready and they don't want to lose time when they could use the old chip from old dev kits to do BC regression tests? I mean, there are many reasons if they have more than one chip (if they have).
Their claim sounds like early (possibly incomplete) information, but true. See my screengrab above.
They can issue a DMCA on false information too.
I don't know, man; adding 20 CUs is much cheaper compared to dual GPUs. We shouldn't go all crazy yet, haha.
Why do dual GPUs keep popping up? I was in the "bullshit" team, but now I'm thinking there might be fire where there is constant smoke.
And how do we even know the exact dates of the tests? The whole thing was deleted... do we have someone who saved the leak?
Without the context of the tests, it could be anything.
Let's speculate that some senior engineer was given an intern for a few months, didn't want to spend any time with him, found an old chip lying around, and gave it to the intern to run wild with and not bother him for a few hours each day.
I don't know, man; adding 20 CUs is much cheaper compared to dual GPUs. We shouldn't go all crazy yet, haha.
Dual GPU can be terrible in the PC space, but in closed, optimized hardware, with interconnect buses faster than PCI Express, it's a whole different story.
As for the cooling system, it's not that hard to design; just make the console a sandwich, with the heatsink occupying the space below and above the components.
It doesn't take a thermal scientist, and made of aluminum it wouldn't even cost that much, because the efficiency of an aluminum heatsink is second to none; with the fan placed in the middle like the PS3, you solve everything.
...Its industrial design enables us to deliver four times the processing power of Xbox One X...
Which was most likely bullshit at that time too. Unless Sony wants a giant bulky console. Considering the Japanese market and how they don't like bulky electronics (part of the reason Xbox was shunned), I really don't think they're in a position to allow that. In order to bring the size down, they'll need to bring the power down. But RDNA should help with that; 9TF will be no slouch by any means, also considering what Sony did with 1.4TF.
First of all this
Plus, if Oberon is related to the PS5, it also means they need to push it to 2GHz to emulate what 48 CUs can deliver in graphics perf @ 1700MHz; that's what explains the V dev kit (2GHz is a lot). The final product WILL NOT run @ 2GHz.
- The dual-chiplet theory is CRAZY; when I read it I was WOOOOOOOOOOOOOOO. The amount of power that would need is beyond 2GHz: chip-to-chip communication, traveling (picojoules, I think that's the right term) from one point to another, needs more power.
- Buying add-ons like a cooler will never happen.
- Their business is to make it easier for consumers, not turning it into a PC; they might make an easy SSD swap, that's it.
So Xbox has two chips, for Lockhart and SeX.
And why would Sony only look at one chip and corner themselves?
If they had any sense, they would work on both a low-end and a more expensive offering and see what transpires with yields and costs. Options are good; no options are not.
What?? Microsoft is rumoured to be making two chipsets because they want to make two different consoles. How is that in any way relatable to Sony using two different chips for one console, as per your assumption?
So Xbox has two chips, for Lockhart and SeX.
And why would Sony only look at one chip and corner themselves?
If they had any sense, they would work on both a low-end and a more expensive offering and see what transpires with yields and costs. Options are good; no options are not.
Japan should NOT be bothered by thickness.
TBF, Sony doesn't seem to be prioritizing the Japanese market anymore. The headquarters was relocated to California a while ago, and they seem to be focusing more on first-party content that appeals to Western markets. I wouldn't be surprised if that extends to the console design, so if the system turns out to be somewhat bulky, they won't care too much if Japan dislikes it, because Japan as a whole has been moving away from traditional home consoles for years.
It'd be a much bigger issue if they were developing a, say, PS Vita 2, and it was chunky and bulky as hell. Japan needs to start appreciating the thickness.
Dude, you got to translate that shit if you want to be understood... or I will start to reply to you in Portuguese, percebido?
So what you are saying is they release a 9.2 TF entry model and a 12 TF high end model at the same time? A $400 and a $500 PS5?
Please, even if English isn't the native language of many around here, the consensus is for everyone to use it.
In my opinion, the PS5 is a dual-GPU chiplet design, with the ray tracing hardware separate from the GPUs and connected to them in parallel.
Why dual GPU? To better manage the two video signals for PSVR2: the two GPUs can each render one of the VR2 views separately.
Yeah, there are both sides, but you have seen Xbox fanboys' videos like Colteastwood's and Dealer Gaming's; it's way beyond what Sony fanboys have been doing. But whatever...
Honestly, it's going both ways. As soon as the 12TF rumor for Xbox came out, "insiders" suddenly started to question whether it was RDNA, even though it was said a million times that it was Navi. Then the rumor of a 9TF PS5 gained some traction from somewhat reputable sources, and suddenly it was 13TF. Now both consoles are roughly the same, with the Xbox having a slight edge. So sure, MS may have some shills, but Sony does also. It just depends on which side of the fence you are rooting for. I'm rooting for both. If both console makers make badass hardware, my PC will actually be forced to use its muscle.