Are these not more detailed versions of the specs released a month ago which are over a year old?
This is just VGLeaks Durango Specs: Hyper-milked edition
Oldest info is from Feb last year. That's hardly 2 years.
Where's that gif of Homer entering and instantly leaving the bar?
Nothing new to see here...
Since when do GPUs require low latency? They have been using GDDR for years. The argument was that low latency helped CPUs far more than GPUs: small data sets, AI, physics, etc. Time for new theories, I guess.
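For what it's worth, the usual argument is that GPUs hide latency with sheer concurrency rather than needing it to be low. A rough Little's-law sketch (the latency and request-size figures below are illustrative assumptions, not anything from the leaks):

```python
# Rough Little's-law sketch: how many memory requests must be in flight to
# sustain a given bandwidth at a given latency. Latency and request size
# are illustrative assumptions, not leaked Durango/PS4 figures.

def requests_in_flight(bandwidth_gb_s, latency_ns, request_bytes=64):
    bytes_per_s = bandwidth_gb_s * 1e9
    latency_s = latency_ns * 1e-9
    return bytes_per_s * latency_s / request_bytes

# A GPU keeping thousands of threads resident can cover this concurrency;
# a CPU core chasing pointers through a small AI/physics data set cannot.
print(requests_in_flight(170, 300))  # ~800 outstanding 64-byte requests
print(requests_in_flight(20, 150))   # ~47, closer to a single CPU core's world
```

A GPU with thousands of threads resident can keep that many requests in flight; a single CPU core working a small AI or physics data set cannot, which is why low latency was always pitched as a CPU-side benefit.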
I think the specialized audio unit hype died down when the PS4 was revealed to have something similar, if I'm not mistaken.
The way each console handles audio could make worlds of difference, but people only seem to care how long their GDDR is.
The info you are referring to is from this year.
But the information was taken from a document from last year.
I said over a year old, not 2.
I think you need to actually read the article and the second page example; it says more about the system than anything yet released. If correct, this is a gold mine. Of course you could glance at the graph, call it old, and ask for something else you will not read or comprehend.
Seems like an overly complicated design.
Strange how it seems like MS have done a Sony and went very complicated and Sony went in the opposite direction.
Weird.
I always thought it'd make a difference, considering the numbers thrown around where devs ended up using up to half of the processing power of current gen consoles (or was that the 360 specifically?) just to process audio. I think the difference in efficiency could make a difference, even if consoles now are much more powerful than they were in 2005. Maybe 'worlds' of difference is exaggerating, but still.

Sony has always had extremely capable audio processors, and I'd imagine MS is doing something competent as well. It really won't make a difference in the end.
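To put a very rough number on the "half the CPU on audio" claim, here is a back-of-the-envelope sketch; every figure in it (voice count, per-voice cost, clock) is an illustrative assumption, not anything from the leaks:

```python
# Back-of-the-envelope: fraction of one CPU core that software audio mixing
# could eat. Voice count, sample rate, per-voice cost and core clock are all
# illustrative assumptions, not figures from the leak.

def core_fraction(voices, sample_rate_hz, cycles_per_voice_sample, core_hz):
    cycles_needed = voices * sample_rate_hz * cycles_per_voice_sample
    return cycles_needed / core_hz

# e.g. 128 voices at 48 kHz, ~100 cycles per voice per sample, 1.6 GHz core
print(core_fraction(128, 48_000, 100, 1.6e9))  # ~0.38 of a core
```

If anything like that holds, a dedicated audio block buys back a meaningful slice of CPU time, whether or not it ends up showing as a visible difference between the two consoles.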
Wait what?
RAM quantity?
Last time I checked they both got 8GB
Latency, I was told, is not a problem due to cache memory and high bandwidth
The CPUs are basically the same
The PS4 GPU is 50% more powerful than the 720s
I don't see how you come to this conclusion.
The document alluded to last year (the Durango conference stuff) contained none of the information presented here.
So it cannot have been from there.
This isn't new info, it's just a usage example made up by vgleaks. You can apply the same logic they used to any system (given the level of detail that we already had).
Question: so the GPU can read from both at 170 GB/s, but wouldn't the data have to be transferred to the ESRAM from the DRAM first at 68 GB/s, in 32 MB chunks? So even if you can read from both at 170 GB/s, it takes a lot of lower-bandwidth steps before it even gets to that point, correct?
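As a rough back-of-the-envelope using the leaked figures (68 GB/s DDR3, 102 GB/s ESRAM, 32 MB ESRAM), staging data into the ESRAM does go through the slower bus, but a full refill is small next to a frame:

```python
# How long a full refill of the 32 MB ESRAM takes over the 68 GB/s DDR3 bus,
# compared with a 60 fps frame. The bus and size figures are from the leak;
# the rest is just arithmetic.

ESRAM_BYTES = 32 * 1024 * 1024   # 32 MB
DDR3_BW = 68e9                   # bytes per second
FRAME_60FPS = 1 / 60             # seconds

fill_time = ESRAM_BYTES / DDR3_BW
print(fill_time * 1e3)           # ~0.49 ms per full 32 MB refill
print(fill_time / FRAME_60FPS)   # ~3% of a 60 fps frame per refill
```

So the 170 GB/s peak only applies to data already sitting in the right pool, but paying roughly half a millisecond a few times per frame to restock the ESRAM isn't obviously crippling either.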
So what's special about the audio?
That won't stop it; there will be endless "my console of choice is stronger than yours, nyah nyah aboo boo" type talk lol
It gets boring though, doesn't it?
I never edited the post. Mods can verify this. Perm ban me if I'm lying.

I like how this is actual fact now.
Edit: damnit you edited to insert "rumor".
I understand the example is made up, but they used the details which are new to us (beyond the flow chart). What is with the concerted effort to discredit this info? Isn't this what everyone wants, more technical details on the systems?
Not for some. People feel the need to validate their purchases. Both systems will be more than capable of providing gamers with great experiences. The power battle just provides people an opportunity to bicker. People love to bicker.
Why are people hanging on to this like it's gospel? It may be true, but Sony has proven that things change. Everyone seems to accept that they have overclocked their CPU, but then acts like MS is forced to keep theirs at the stock speed.
They can increase their CPU clock speed, but some people are suggesting a GPU clock increase would produce much more heat.
You and your logic...
Clearly no, not everyone cares about vgleaks' analysis of information we already know (and I don't see how it is based on more than the flowchart when it is literally following the flowchart and deducting some amount for 'real world' estimates). What we want is new data (as in information on the areas we don't know about yet, not some fantasy of new specs). While vgleaks is under no obligation to provide that, it isn't discrediting the article to say that it is uninteresting.
Coming out of my sarcastic mode... the GPU logic on the die would take up (a lot) more space than the Jaguar cores (which are tiny), so it's reasonable to expect increasing the CPU clock would come at a lower cost than the GPU.
(it was a compliment)
Unless they changed the cooling system there would be no cost increases other than testing.

I didn't mean cost in the literal sense, but rather from a thermal budget POV. It's just that clocking 12 (much larger) CUs at a higher rate vs 8 small CPU cores would increase the heat by a bit.
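The usual hand-wavy version of that thermal argument: dynamic power scales roughly with the amount of switching logic times clock times voltage squared, so a clock bump on the big GPU block costs more watts than the same bump on the tiny Jaguar cores. A toy model (the power budget, area shares and clock bump below are illustrative assumptions, not leaked numbers):

```python
# Toy dynamic-power model: P ~ area * clock * V^2, with die area standing in
# for switched capacitance. Power budget, area shares, clock bump and voltage
# are illustrative assumptions, not leaked numbers.

def extra_watts(chip_power_w, area_share, clock_ratio, voltage_ratio=1.0):
    block_power = chip_power_w * area_share
    return block_power * (clock_ratio * voltage_ratio**2 - 1)

CHIP_POWER = 100.0  # assumed total dynamic power in watts

# +10% clock on a GPU block taking ~50% of the die vs Jaguar cores at ~15%
print(extra_watts(CHIP_POWER, 0.50, 1.10))  # ~5.0 W extra
print(extra_watts(CHIP_POWER, 0.15, 1.10))  # ~1.5 W extra
```

And if the voltage has to rise along with the clock, the gap widens further, since power goes with the square of voltage.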
I'm not knowledgeable enough to discuss this, but I would be shocked if 3rd parties refused to tap into the PS4's extra power for the sake of visual parity.
That seems so dumb to me; I'm all for each machine being made the most of.
Just like the PS3 right?
The architectures are comparable this generation. They weren't last generation. The closer the architectures, the more likely they will use the advantages of the systems.
Not an accurate comparison. Devs found the PS3 difficult to develop for from the word go, so any additional optimization was likely out of the question unless it was the one console you were developing for (as in, Sony-owned studios, who did in fact show off some of that extra power).
However, whether or not devs take advantage of any additional PS4 power, or whatever ends up being more powerful for that matter, is still an open question none of us can answer. I certainly wouldn't take it as a given that they would.
So with this audio chip can we expect the New Xbox games to feature DTS and/or LPCM 5.1/7.1 audio?
I'm not too sure; I think there was a rumor somewhere saying there was, but I have no idea of the source. Also, is there anything in here to suggest that Microsoft has an answer to Sony's Share Button?
There were a number of studios that made good use of the PS3 beyond first party, like Valve, Criterion and Visceral...
But Durango has more RAM.
Durango: 8192 MB DDR3 RAM + 32 MB ESRAM = 8224 MB
PS4: 8192 MB GDDR5 = 8192 MB
8224 > 8192
Similarly, Durango has the same amount of bandwidth as PS4; 68 + 102 = 170 GB/s.
Oh and ignore the PS4 CPU clock bump rumor to 2GHz. Also ignore the latency cycles.
The PS4 GPU is not 50% more powerful but the Durango is 33% less.
I think I covered everything.
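For what it's worth, the "50% more" and "33% less" lines are the same relationship measured from opposite baselines, not two different claims:

```python
# "PS4 is 50% more" and "Durango is 33% less" describe the same ratio, just
# measured from different baselines. 1.0 below is a stand-in for Durango's
# GPU throughput; no real units implied.

durango = 1.0
ps4 = durango * 1.5                 # 50% more than Durango

print((ps4 - durango) / durango)    # 0.5   -> PS4 sits 50% above Durango
print((ps4 - durango) / ps4)        # 0.333 -> Durango sits ~33% below PS4
```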
Yes. They also have the storage capabilities (50GB disc).

The notion behind the share button (and some of the other features that stem from it) seems incredibly important to me as a community building tool, a development from cross game/party chat. So much so that I figured, in the build up to these reveals, that Microsoft would be the one to do it rather than Sony, as a natural extension of XBL.
Indeed. Those examples were few and far between though.
Are you trolling here? Because you cannot simply add numbers together.
That's like saying 2 bathrooms in your house let you poop twice as fast.
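Right, and the 170 GB/s figure is itself a best case: you only see the two buses sum if the read traffic happens to split between the pools in roughly the ratio of their bandwidths. A toy model using the leaked 68 and 102 GB/s figures (the split fractions tried below are just illustrative):

```python
# Peak combined bandwidth as a function of how read traffic splits between
# the ESRAM (102 GB/s) and DDR3 (68 GB/s) buses. Only at a ~60/40 split do
# both buses saturate at once and sum to 170 GB/s; any other mix is held
# back by one of the two. The split values tried below are illustrative.

ESRAM_BW, DDR3_BW = 102.0, 68.0   # GB/s, from the leak

def combined_bw(esram_fraction):
    f = esram_fraction
    if f <= 0.0:
        return DDR3_BW
    if f >= 1.0:
        return ESRAM_BW
    # the total rate R is limited by R*f <= 102 and R*(1-f) <= 68
    return min(ESRAM_BW / f, DDR3_BW / (1.0 - f))

for f in (0.0, 0.3, 0.6, 0.9, 1.0):
    print(f, round(combined_bw(f), 1))   # peaks at 170.0 only when f = 0.6
```

Anything that leans too hard on one pool is bottlenecked by that pool alone.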
Don't get me wrong, the primary purpose of the fast memory pool is to increase the overall bandwidth; they had to add a fast pool the moment they decided on DDR3. Having said that, they selected a low latency solution because they saw value in it.
Yes, GPUs have caches; the question becomes how effective they are. It's hard to quantify without running a lot of tests on a lot of existing titles.
The system supports rendering to either memory pool. I suspect any real renderer would render to both inside a frame; there is the issue of how much data copying you end up doing. The DMEs are there for a reason, but it all eats bandwidth.
PRTs make it feasible to know pretty much exactly what parts of what textures were actually used in the last frame, but you still have to use that knowledge effectively.
As I've said before, the split memory/bandwidth/ROP count would still be the things that concern me most in the design, but I wouldn't judge anything without actually using it.
My guess is the quantity of memory was important early in the design, which dictated DDR3, which dictated the fast memory pool; statistical data and manufacturing complexity probably indicated using SRAM instead of eDRAM.
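On the split-memory worry, a quick sizing sketch shows how tight 32 MB gets for render targets; the formats and target mix below are common examples, not details of any actual Durango renderer:

```python
# How quickly 1080p render targets eat a 32 MB fast pool. The formats and
# target mix are common examples, not details of any actual Durango renderer.

W, H = 1920, 1080
MB = 1024 * 1024

def target_mb(bytes_per_pixel):
    return W * H * bytes_per_pixel / MB

color   = target_mb(4)   # RGBA8,  ~7.9 MB
depth   = target_mb(4)   # D24S8,  ~7.9 MB
normals = target_mb(8)   # RGBA16F example G-buffer target, ~15.8 MB

print(color + depth)             # ~15.8 MB: a simple forward setup fits easily
print(color + depth + normals)   # ~31.6 MB: a modest G-buffer already brushes 32 MB
```

Which is presumably why a real renderer would end up targeting both pools and leaning on the DMEs to shuffle data around, at the cost of some of that bandwidth.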
Sony matching the 8GB is a big deal IMO.
Only reason for it - if genuine and current - must be cost reduction.
Yeah, I'm curious to see launch prices. Sony has a bigger opportunity for post-release price reduction, but they are going to be starting from a higher base.