VGleaks: Orbis Unveiled! [Updated]

Anyway, I think it's kind of funny that, in some senses, we actually know slightly less about Orbis's setup now than we did before this latest VGLeaks.

The question over the '14+4', and whether that simply means 4 are tweaked in a different way or 4 live in a distinct world away from the other 14, has some non-negligible repercussions in terms of different tradeoffs and design goals. And we don't have a solid answer yet.

Could it be possible that for final hardware those 4 CUs won't even be there, and will be replaced by silicon that is specific to the task?
 
llherre already said both consoles will be a wash in power.

You won't hear anything from Arne and Stinkles unless you want to see them executed by their employers.

Terminated, as in their employment contracts, would be a more apt description. Realistically, I can't see this speculated "wash"/"similitude" in power. If these specs are final, then the next PlayStation is without a doubt more powerful. However, the war for supremacy becomes a battle for efficiency once they've launched, and whoever leads that is the one who reaps the most graphical benefits.
 
From HC24, the Trinity floorplan.

http://www.hotchips.org/wp-content/...yCore/HC24.28.325-Trinity-Nussbaum-AMD-v2.pdf
http://youtu.be/CS-POoiCOQo

Jaguar APU

http://www.hotchips.org/wp-content/...roprocessor/HC24.28.120-Jaguar-Rupley-AMD.pdf
http://youtu.be/_GXA38vFPXA

Source: http://www.hotchips.org/archives/hc24

edit: I don't think anything will be partitioned off. Two dual-core Piledriver modules make one quad-core, which is roughly comparable to two quad-core Jaguars... then you add the graphics part on the side. And that's AMD's job done.
 
Could it be possible that for final hardware those 4 CUs won't even be there, and will be replaced by silicon that is specific to the task?

I would doubt it. It seems they're presented as units that are in some way more optimal for GPU compute jobs in general, and if they were to be replaced you'd need compatible hardware to run developers' compute jobs. Not sure what more 'specific' silicon they could or would use.
 
Think that's the standard GCN CU layout...

Yes, it seems that way, which makes it even more stupid if 4 regular CUs have just been separated off. But I have the feeling that the leaks don't come from people who understand much about what they post anyway.

The only thing I could think of would be a different prediction unit and a modified scheduler to handle general tasks a little better. I'm not really sure, but I always thought that AMD GPU CUs are not preemptive - so maybe those 4 CUs are.
 
Let me help out those of you who can't comprehend what a wash in power means.

PS3/360 power gap.

[PS3 vs. 360 comparison screenshots]


PS2/Xbox power gap.

[Wreckless: PS2 vs. Xbox comparison screenshots]
 
Yes, it seems that way, which makes it even more stupid if 4 regular CUs have just been separated off. But I have the feeling that the leaks don't come from people who understand much about what they post anyway.

The only thing I could think of would be a different prediction unit and a modified scheduler to handle general tasks a little better. I'm not really sure, but I always thought that AMD GPU CUs are not preemptive - so maybe those 4 CUs are.

I think that's the suggested motivation for separating them out - so they're on a separate scheduling regime, perhaps one with more programmer input. Which might be better (from a potential utilisation/efficiency/performance POV) than mixing them in on the regular GPU scheduler, going by AMD's own presentations anyway (at least before 'full' HSA chips with pre-emption and context switching).

These are guesses. But still, we're left with more questions than we thought we had before this leak :)
 
Just out of curiosity, how many of you actually use BC? Am I the only one who wouldn't care if the PS4 doesn't play PS3 games? I'll just move my PS3 to my bedroom, and the PS4 will take its place in my mancave.

This next generation transition is different from any past generation transition. Users built huge digital libraries over the PS3 life cycle. Those PSN digital games are not resalable, and if they are only playable on PS3 then they are doomed to almost immediate obsolescence as soon as the user buys a PS4. And if Sony chooses not to include BC for those titles, it ignores the advantage of having an instant library, at no extra cost to the user or Sony (barring bandwidth/server costs), available at the start of a hardware generation that can prove very challenging in terms of software offerings.

I wasn't much of a user of BC in generations past, but the accessibility of the Vita changed all of that. I've got more PSP games on the thing than Vita games (I've even double-dipped on some PSP games I already own), purely because the accessibility is so high and the prices are so low, e.g. GTA LCS, VCS, and CTW, all for $15 - how could I resist? BC like that is a dangerous thing when your credit card and PSN cards are so easily on hand.
 
I would doubt it. It seems they're presented as units that are in some way more optimal for GPU compute jobs in general, and if they were to be replaced you'd need compatible hardware to run developers' compute jobs. Not sure what more 'specific' silicon they could or would use.

Well then it doesn't make sense to "reserve" these CUs. Sony should just let the software predetermine how that works.

It's true that there will be no context pre-emption in Orbis, but I think by coding to the metal you can pre-assign however many CUs you want for GPGPU purposes.
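
Just to illustrate the idea (every name here is made up for the sketch; nothing is from the leaked docs), pre-assigning CUs could amount to splitting one mask into a render group and a compute group:

```python
# Hypothetical sketch of pre-assigning CUs with bitmasks, the way
# "to the metal" code might steer dispatches. All names are made up
# for illustration; nothing here is from the leaked docs.
TOTAL_CUS = 18

render_mask = (1 << 14) - 1                           # CUs 0-13 for graphics
compute_mask = ((1 << TOTAL_CUS) - 1) & ~render_mask  # CUs 14-17 for GPGPU

def cu_ids(mask):
    """Return the CU indices a mask enables."""
    return [i for i in range(TOTAL_CUS) if mask & (1 << i)]

print("render CUs: ", cu_ids(render_mask))    # [0, 1, ..., 13]
print("compute CUs:", cu_ids(compute_mask))   # [14, 15, 16, 17]
```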
 
This next generation transition is different from any past generation transition. Users built huge digital libraries over the PS3 life cycle. Those PSN digital games are not resalable, and if they are only playable on PS3 then they are doomed to almost immediate obsolescence as soon as the user buys a PS4. And if Sony chooses not to include BC for those titles, it ignores the advantage of having an instant library, at no extra cost to the user or Sony (barring bandwidth/server costs), available at the start of a hardware generation that can prove very challenging in terms of software offerings.

Backwards compatibility is also good for people who didn't own the previous system. PS4 could be appealing to 360 owners who choose to switch next generation, as they'd get a chance to try out all the PS3 exclusives they missed. For me personally, one of the main reasons I bought a Wii near launch was that it was backwards compatible with GameCube; I'd get to try some great gems I missed out on. Sony should try and steal as many 360 customers as possible. So far, according to leaks, it looks like PS4 is more powerful; add free online, PS+, and more exclusives, and backwards compatibility would be one more good feature for 360 owners.
 
Ohh, we're on BC now? I don't give a FK. I understand people want it; hell, I do too. It's not going to stop me from getting the PS4 though. I don't even know if I want it, tbh. I have a shit ton of crap on my PS3 already. I'm fine with it there; I don't want to be moving all that crap over.
 
Found the Trinity floorplan.

[Trinity floorplan image]


Replace DDR3 with GDDR5, or whatever memory controller they are going for.
Note that APU/CPU/GPU performance improves with more bandwidth.
Replace Northern Islands with Southern Islands, either 7000 series or 8000 series.
Replace the two dual-core modules with two quad-core Jaguars.
 
Gah, my buddy was teasing me about some stuff he "knew" concerning next-gen gaming. I finally ran into him this past weekend and it turned out to be some stuff related to NVIDIA.

I was kind of excited at the possibility of grasping some details to share on GAF, but I guess that won't be happening anytime soon. Everyone I know from SCEA is being tight-lipped too, although I'm meeting with one of my old buddies (a lead tester) soon!
 
All the CUs have the same texture sampling hardware. If a CU didn't have texture units it wouldn't be able to read data D: (I think!) And the CUs aren't prescribed/reserved for any one particular task even if they are tweaked or arranged optimally for that task...

The leaked info is pretty clear that all 18 CUs have texturing hardware anyway.

Well, it wouldn't make sense to have CUs that are reserved for physics doing texturing. They might have texturing hardware, but they won't be doing any texturing, which makes it kind of a number on paper that doesn't mean anything in real life.
 
Yes, having a choice is a terrible annoyance.

At the expense of what? 100 more tacked onto the console cost? I understood the PS2-to-PS3 outrage, but with a 500GB HDD full of mandatory installs, PS+ games, downloadable content, etc., I'm good with where it is. The last thing I want to do is bust open my PS4 and fill my new machine with all my old crap that's perfectly playable on my PS3.

If they had an add-on like what's being talked about on here... fine. Jumping through hoops and adding cost to the console so I can throw a PS3 disc into my PS4... I'm good.
 
Found the Trinity floorplan.

[Trinity floorplan image]


Replace DDR3 with GDDR5, or whatever memory controller they are going for.
Note that APU/CPU/GPU performance improves with more bandwidth.
Replace Northern Islands with Southern Islands, either 7000 series or 8000 series.
Replace the two dual-core modules with two quad-core Jaguars.
You need to account for the increase in die area for the increase in CUs as well.

Plus you can remove some redundant logic, like PCIe, etc.
 
At the expense of what? 100 more tacked onto the console cost?
It's not obvious how much, if anything, it would add to the cost.

I understood the PS2-to-PS3 outrage, but with a 500GB HDD full of mandatory installs, PS+ games, downloadable content, etc., I'm good with where it is. The last thing I want to do is bust open my PS4 and fill my new machine with all my old crap that's perfectly playable on my PS3.
There is nothing FORCING you to move your stuff over; if you're fine having two machines running, that's your prerogative. But I personally don't trust my current PS3, and in X number of years when it dies I still want to be able to play the software I own, and without BC I won't be able to unless I find a used system somewhere.

It's clear that Sony needs to do something to encourage people to stay and continue consuming content on the PS4, and BC would do that. If Sony doesn't include even some rudimentary BC on the PS4 and Microsoft does with Durango, you can bet that a LOT of people who supported both systems from a digital-download point of view will no longer support Sony going forward, myself included.
 
Well then it doesn't make sense to "reserve" these CUs. Sony should just let the software predetermine how that works.

It's true that there will be no context pre-emption in Orbis, but I think by coding to the metal you can pre-assign however many CUs you want for GPGPU purposes.


I'm not sure. There's a hardware balancer in GCN GPUs, and I don't think it has any exposure for the programmer, even at a low level.

The argument for having two distinct groups of CUs in a pre-HSA world is made in an AMD presentation, although I can't put my finger on it right now... I'm not at all sure this is the route Sony's gone, but it is a possibility.

Well, it wouldn't make sense to have CUs that are reserved for physics doing texturing. They might have texturing hardware, but they won't be doing any texturing, which makes it kind of a number on paper that doesn't mean anything in real life.

The CUs can be used for rendering too, though. Why remove that use case? I'm also not sure that in compute jobs data reads aren't done through the texture samplers - and there are plenty of data reads in all kinds of compute jobs. The leaked info is clear, anyway, that they have that hardware.
 
Well then it doesn't make sense to "reserve" these CUs. Sony should just let the software predetermine how that works.

It's true that there will be no context pre-emption in Orbis, but I think by coding to the metal you can pre-assign however many CUs you want for GPGPU purposes.

Is it possible that there is a second scheduler on the GPU for those 4 CUs? That way graphics would run on one and compute would run on the other, so if a game ran a lot of compute-heavy stuff it wouldn't gum up the scheduler the way it might with only one.
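
To make the question concrete, here's a toy model of two schedulers draining their own queues, so a pile of compute work never sits in front of draw submissions (purely illustrative, not how GCN actually works):

```python
from collections import deque

# Toy model of two independent schedulers, one per CU group. A backlog of
# compute jobs on the 4-CU group's queue never blocks graphics submissions
# on the 14-CU group. Purely illustrative; not real GCN behaviour.
graphics_q = deque(f"draw_{i}" for i in range(3))
compute_q = deque(f"physics_{i}" for i in range(5))

tick = 0
while graphics_q or compute_q:
    tick += 1
    gfx = graphics_q.popleft() if graphics_q else "idle"
    gpgpu = compute_q.popleft() if compute_q else "idle"
    print(f"tick {tick}: 14-CU group -> {gfx:9s} | 4-CU group -> {gpgpu}")
```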
 
It's not obvious how much, if anything, it would add to the cost.

You clearly have no idea what the actual price of hardware consists of. Each new chip is a cost. And when you add new hardware, the whole architecture becomes more complicated and much harder to scale in the future.

The reason Sony and Microsoft are using AMD is that it means a lower price and, in the future, faster cost reduction. They will make money on the hardware a lot sooner and will lower their prices much, much faster than in the X360/PS3 gen.

With this hardware design I don't expect it to cost more than $400 day one. Can't really say about Durango, since it will probably have the new Kinect built in, which will drive the price up; that is probably the reason why it is only 1.2 TF and not 1.8 TF like Sony (to keep the price low).

[tweet screenshot]


I was saying that all along. Efficiency will be a lot better than people assume just from flops (see the 680 vs. consoles example).
 
[tweet screenshot]


Not a good message to PC GAF.

But yeah. Popcorn.

Why? The whole reason PC GAF loves the platform so much is that it isn't defined by generations. Even if the next-gen consoles had better performance than a good gaming PC today (I would love to see that), they would still be surpassed by PCs within a year.
 
Well, because when I tell someone who expected to see GCN2 in next-gen consoles that Durango might have a GCN2 GPU according to rumors, somebody else seems to get bothered by this rumor. Then you tell me that GCN2 won't bring anything but a minimal difference, and somebody else comes in to throw a "but the sauce" joke, trying to imply that in the course of this discussion somebody tried to play the "Durango has secret sauce" card.

Really, it stinks of insecurity.

Oh boy.
 
I'm not arguing anything. I know the Wii U is weaker, but the "real next gen" consoles were supposed to be like the leap from Wii to PS3, and it's clear it's MUCH closer this time around.

Is it really? Once again, it depends on how you compare the two generations. From a pure power standpoint, or flop count, it seems like it's closer to the Wii vs. PS3/360 comparison than you may think. I don't know how many flops the Wii had, but once again, according to these rumors, the "compute portion" of Orbis alone is almost as powerful as the ENTIRETY of the Wii U system (410 GFLOPS)! The Wii U GPU is speculated to be about ~400 GFLOPS, and I know the CPU is WAY less than 100 GFLOPS (~40 GFLOPS?).
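
For reference, here's the back-of-envelope math behind that 410 GFLOPS figure, assuming the rumored 800 MHz clock and the standard GCN layout of 64 ALUs per CU doing 2 FLOPs per cycle via fused multiply-add:

```python
# Back-of-envelope GCN math for the leaked Orbis figures. Assumes the
# rumored 800 MHz clock, 64 ALUs per CU, and 2 FLOPs/ALU/cycle (FMA).
ALUS_PER_CU = 64
FLOPS_PER_ALU_PER_CYCLE = 2  # one fused multiply-add
CLOCK_GHZ = 0.8

def gflops(num_cus):
    return num_cus * ALUS_PER_CU * FLOPS_PER_ALU_PER_CYCLE * CLOCK_GHZ

print(gflops(4))   # 409.6  -> the "410 GFLOPS" compute portion
print(gflops(14))  # 1433.6 -> the rendering portion
print(gflops(18))  # 1843.2 -> the "1.8 TF" total
```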

Now, the flop comparison isn't fair when comparing it to last-generation flops. Once again, it's not an apples-to-apples comparison. FLOPs don't describe everything, or even 50% of it.

The reason the last-gen Wii vs. PS3 gap was bigger is fundamental feature differences that are not as large this time around. Mainly, the Wii didn't have programmable shaders, but PS3 and 360 did!

There will still be some feature differences, because the Wii U GPU has a feature set comparable to DX10, while the 720/PS4 will have a feature set comparable to DX11.1. How significant those differences will be I don't know, because I'm not knowledgeable enough, but I'm told it's not nearly as much as programmable shaders NOT being POSSIBLE on the Wii. Please, someone correct me if I'm wrong.
 
Why? The whole reason PC GAF loves the platform so much is that it isn't defined by generations. Even if the next-gen consoles had better performance than a good gaming PC today (I would love to see that), they would still be surpassed by PCs within a year.

I was told that the next-gen consoles would be comparable to mid-range PC power on release. A comment like that would put them at the high end.

Of course, this just means PC GAF would have to upgrade their cards, which is absolutely normal.
 
Why? The whole reason PC GAF loves the platform so much is that it isn't defined by generations. Even if the next-gen consoles had better performance than a good gaming PC today (I would love to see that), they would still be surpassed by PCs within a year.

Because a lot of them think that if consoles don't have a 680 in them, they won't do Agni at 60fps@1080p.

And you are right: in a year, probably, depending on genre, we will see games looking better on PC. It should also be noted that the "look" of a game is not just better res.

Also, this tweet is pretty exciting, considering there are still some things that could change a lot next gen.
 
Would it be a matter of greater hardware performance, or the benefits of a closed system allowing developers to take greater advantage of the performance available to them? If it were currently possible to achieve greater performance with significantly less power consumption and lower costs, why wouldn't that kind of technology be rolled back into AMD's other products?
 
Because a lot of them think that if consoles don't have a 680 in them, they won't do Agni at 60fps@1080p.

And you are right: in a year, probably, depending on genre, we will see games looking better on PC. It should also be noted that the "look" of a game is not just better res.

Also, this tweet is pretty exciting, considering there are still some things that could change a lot next gen.

Also, does that comment really translate to flops not meaning jack shit for console power?
 
We've known Pitcairn's die size for a while: http://www.neogaf.com/forum/showthread.php?t=465532

Also, the die shot for Pitcairn/Cape Verde is not accurate.

Out of curiosity, how much do you think an APU consisting of an 8-core Jaguar and an underclocked Pitcairn, both custom, will cost?

I think about half of £399, less than £200, maybe even £150 to £175...

Big-ass APU though. Nothing on their roadmap has an 8-core Jaguar variant. I hope it's an 8000-series variant rather than a 7000, just 'cause every little bit helps. Regardless, there's nothing like it on the market.
 
I was told that the next-gen consoles would be comparable to mid-range PC power on release. A comment like that would put them at the high end.

Of course, this just means PC GAF would have to upgrade their cards, which is absolutely normal.

Because a lot of them think that if consoles don't have a 680 in them, they won't do Agni at 60fps@1080p.

And you are right: in a year, probably, depending on genre, we will see games looking better on PC. It should also be noted that the "look" of a game is not just better res.

Also, this tweet is pretty exciting, considering there are still some things that could change a lot next gen.

I agree. Even though I'll probably stay on PC for a while when the next generation hits, I would love to see the consoles capable of running games superior to PC versions. This would make the PC as a platform move forward as well.

By the way, who is that tweet from and why should we trust him?
 
Would it be a matter of greater hardware performance, or the benefits of a closed system allowing developers to take greater advantage of the performance available to them? If it were currently possible to achieve greater performance with significantly less power consumption and lower costs, why wouldn't that kind of technology be rolled back into AMD's other products?

Part of it has to do with drivers and APIs on PC vs console (see Timothy Lottes' blog about Orbis & Durango for some interesting detail), and part of it has to do with features you can implement on a console that you couldn't get away with on PC - or that wouldn't be supported generally on PC because of the fragmented GPU landscape. For example, eDRAM or eSRAM on a GPU - the reason it's not done on PC is at least partially because it requires programmer care. You can lean on programmers in a console box to make specific use of your specific features... not so much on PC, where everything has to be abstracted away behind an API. You're unlikely to want to code a render path just for one PC GPU that might offer eDRAM.
 
I agree. Even though I'll probably stay on PC for a while when the next generation hits, I would love to see the consoles capable of running games superior to PC versions. This would make the PC as a platform move forward as well.

By the way, who is that tweet from and why should we trust him?

He is a journalist who pops up a lot in the Durango threads. He seems to imply that he has docs, but he never shares anything until another website reveals it, and then he only confirms that either it is similar to what he has or some parts are missing.

Previously he said Durango beat Orbis by a big margin, but now the gap is a wash. He has also previously said that he has heard more about Durango from devs, and little about Orbis.
 
Also, does that comment really translate to flops not meaning jack shit for console power?

No, it just states what I was talking about. You can't compare modular PC hardware to a closed console box. Flops may be lower in consoles, but they can be used better, and consoles can get close to their theoretical flop power. A PC, on the other hand, can have much bigger TFLOP numbers but can struggle to reach that theoretical power.

I spend most of my days in frustration knowing damn well what I could do with the hardware, but what I cannot do because Microsoft and IHVs won't provide low-level GPU access in PC APIs. One simple example, drawcalls on PC have easily 10x to 100x the overhead of a console with a libGCM style API....

That's just one quote, and there are more things he mentioned.
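
To put that 10x to 100x figure in frame-budget terms, here's a rough sketch; the 2 microsecond console-side cost per call is an assumption for illustration, not a number from the quote:

```python
# What "10x to 100x drawcall overhead" does to a 60 fps frame budget.
# The 2 microsecond console-side cost per call is an assumed figure
# for illustration only; the 10x/100x multipliers are from the quote.
FRAME_US = 16_667           # one frame at 60 fps, in microseconds
CONSOLE_US_PER_CALL = 2.0   # assumed cheap libGCM-style submission cost

for overhead in (1, 10, 100):
    per_call = CONSOLE_US_PER_CALL * overhead
    print(f"{overhead:>3}x overhead -> ~{FRAME_US / per_call:,.0f} draw calls/frame")
```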
 
That's less than the 7900 series.

Don't forget, developers think in frames, not seconds. Divide 1.6 billion triangles by 60 and you get 26.6 million triangles per frame in a 60fps game.

How do PS3/360 compare to these numbers? I remember reading dev interviews for Uncharted 2 saying they were pushing about 20 million triangles per frame (at 30fps). Then again, this is from old memory, and they could have said 2 million or 20 million polys, I don't know.
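
For what it's worth, here's that frame-budget division at both framerates, using the 1.6 billion triangles/second figure quoted above:

```python
# Per-frame triangle budgets from a peak rate: the same 1.6 billion
# triangles/second looks very different at 60 fps vs. 30 fps.
def tris_per_frame(tris_per_second, fps):
    return tris_per_second / fps

print(f"{tris_per_frame(1.6e9, 60) / 1e6:.1f}M tris/frame at 60 fps")  # ~26.7M
print(f"{tris_per_frame(1.6e9, 30) / 1e6:.1f}M tris/frame at 30 fps")  # ~53.3M
```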
 
No, it just states what I was talking about. You can't compare modular PC hardware to a closed console box. Flops may be lower in consoles, but they can be used better, and consoles can get close to their theoretical flop power. A PC, on the other hand, can have much bigger TFLOP numbers but can struggle to reach that theoretical power.



That's just one quote, and there are more things he mentioned.


In other words, even that 680 running Watch Dogs wasn't running at peak performance.
 
He is a journalist who pops up a lot in the Durango threads. He seems to imply that he has docs, but he never shares anything until another website reveals it, and then he only confirms that either it is similar to what he has or some parts are missing.

Previously he said Durango beat Orbis by a big margin, but now the gap is a wash. He has also previously said that he has heard more about Durango from devs, and little about Orbis.

Alright, that clears things up a bit.
 
Out of curiosity, how much do you think an APU consisting of an 8-core Jaguar and an underclocked Pitcairn, both custom, will cost?

I think about half of £399, less than £200, maybe even £150 to £175...

Big-ass APU though. Nothing on their roadmap has an 8-core Jaguar variant. I hope it's an 8000-series variant rather than a 7000, just 'cause every little bit helps. Regardless, there's nothing like it on the market.

$200 would be my guess, depending on the initial yields.
 
Ohh, we're on BC now? I don't give a FK. I understand people want it; hell, I do too. It's not going to stop me from getting the PS4 though. I don't even know if I want it, tbh. I have a shit ton of crap on my PS3 already. I'm fine with it there; I don't want to be moving all that crap over.

A lot of people only have enough space for one console in their TV rack. Me too. I have a lot of uncompleted PS3 titles. If I buy a PS4, I probably will never use my PS3 again. With BC I'd keep the best and exclusive games plus the uncompleted titles, and sell the console with 15 games for 200€. In that case: PS4 day one.

With no BC I'll probably wait a year or two, so I can clear my backlog and the console gets cheaper. I still have a PC, so no big problem. Plus, I'm used to Steam, where my bought content remains playable. It's a big letdown that I wouldn't be able to play those PSN titles anymore. I'll probably buy less via PSN in the future then.
 