VGleaks: Orbis Unveiled! [Updated]

No. That's a myth. That story began at the start of the PS360 generation and it's wrong.

Sony announced they had been collaborating with Nvidia on a GPU for the next PlayStation in 2004, two years before it launched. That's not last minute. All console companies come up with multiple designs. It's true they had thought about creating their own GPU or using a Cell as a GPU, but I think they realized it would be unfeasible pretty early. The whole story is really BS that I think was started to explain why Sony's console had the weaker GPU, since Xenos was obviously superior from the moment the specs were announced.

that is pretty last minute.
 
There is a huge difference between an OS kernel based on Win 8 and it being stock Windows 8

Of course there is. I haven't said that the Xbox will just run native Windows 8.

But some people are arguing that its OS will have nothing in common with Windows, when it will obviously share the same core kernel. As per Windows Phone 8.

The goal will surely be to allow for easy porting of Windows 8 apps over to Xbox. It would make even more sense if apps ran natively on both, but I don't know how likely that is.
 
They don't need anywhere near the rumored amount of RAM to do all of that.

They only need 2-3GB reserved if it's running some kind of Windows OS, which it obviously will, since MS wants to unify all their products. The Xbox isn't going to be the odd man out; expect Windows branding all over the place.

Nonsense, nobody knows what features they have planned for Durango, so you can't reasonably claim that. Besides, if they could achieve all that with a lot less RAM, as you claim, then it would make no sense for them to put Windows 8 on there just for the sake of it. A full general-purpose OS designed for modular machines makes absolutely no sense on a console, no matter the complexity of its planned multimedia capabilities. Unless for some reason you believe Microsoft have suddenly turned into a bunch of clueless idiots.
 
No. That's a myth. That story began at the start of the PS360 generation and it's wrong.

Sony announced they had been collaborating with Nvidia on a GPU for the next PlayStation in 2004, two years before it launched. That's not last minute. All console companies come up with multiple designs. It's true they had thought about creating their own GPU or using a Cell as a GPU, but I think they realized it would be unfeasible pretty early. The whole story is really BS that I think was started to explain why Sony's console had the weaker GPU, since Xenos was obviously superior from the moment the specs were announced.

Ummmm, you do realize the PS3 was announced in '05, which would coincide with that being last minute. Your "myth" just reinforced the point...
 
that is pretty last minute.


Nonsense. Because the analogy would be if one company had announced back in 2011 that they'd been collaborating with AMD on a next-gen GPU. You're implying that to not be too "last minute", Sony and MS would have had to have their design choices for this upcoming gen nailed down back in 2010 or earlier. I think that's ridiculous on its face.
 
No. That's a myth. That story began at the start of the PS360 generation and it's wrong.

When people say 'last minute' they don't mean weeks before the PS3 started production. But a console would be in the R&D and planning stage for years before it is released. The fact is the original design for the PS3 didn't include a 3rd party GPU and one was only included when the in-house GPU didn't come to fruition.

There's a book by one of the engineers who worked on the PS3 that covers this in detail. I've tried to find it on Amazon, but I can't remember the name of it.
 
Ummmm, you do realize the PS3 was announced in '05, which would coincide with that being last minute. Your "myth" just reinforced the point...

Explain yourself more clearly; I don't understand the point you're trying to make. Sony announced they were working with Nvidia before the PS3 was even announced.
 
Probably a slew of new features, including desktop-OS-style multitasking, and maybe even the capability to run the Windows Mobile OS and its apps and games, to tie in with the phones: XBL cross-platform, a mobile and desktop ecosystem, etc.

^This. As well as probably reviving your lost pet Fluffy, making your waffles (but no pancakes), and doing your homework for you. Oh, and maybe some Kinect features too...

IMO, when MS reveals this system they'll show off some non-gaming-related feature that will surprise people or blow some minds. It will start a serious debate over whether Orbis is done out of the gate and whether Sony should have done the same, and then the other camp will argue that it wasn't worth the risk and the big performance loss for games.

The debate will end in a stalemate, with some saying, "If gaming is your main focus, get a PS4. If you want to do a ton of cool things as well as some gaming, get a 720."

Then other, even more reasonable people will say it's too early to predict the impact of this and to say which was the better route. Come back in 5 years!
 
a console would be in the R&D and planning stage for years before it is released.


They announced they had been collaborating with Nvidia in 2004. I think that qualifies as still being in the planning stage. I've read before on B3D that it takes at least 2 years to design a console from scratch.


The fact is the original design for the PS3 didn't include a 3rd party GPU and one was only included when the in-house GPU didn't come to fruition.

That fact does not prove that RSX was last minute. As I've said, each one of these console companies has multiple design concepts.

RSX is integrated into the PS3. Whoever said earlier that it was tacked on is speaking from ignorance.
 
:] So they just spent a ton of money "just because"? Sorry, your story doesn't hold up.

They spent a lot of money just because. A console that cost $800+ to make at launch can tell you that.


Nonsense. Because the analogy would be if one company had announced back in 2011 that they'd been collaborating with AMD on a next-gen GPU. You're implying that to not be too "last minute", Sony and MS would have had to have their design choices for this upcoming gen nailed down back in 2010 or earlier. I think that's ridiculous on its face.

Nobody is announcing early this round. Sony would have been working with AMD since the PS3 launched.
 
I think the 'last minute' rumor for RSX stems from the fact that it was decided upon later than most console parts usually are and it was downgraded big time shortly before launch.
 
They announced they had been collaborating with Nvidia in 2004. I think that qualifies as still being in the planning stage. I've read before on B3D that it takes at least 2 years to design a console from scratch.

Development started on Cell in early 2001. It would have been done by Sony with the PS3 clearly in mind, so that gives a public date for when planning for the PS3 commenced. Nvidia was announced as supplying the GPU in December 2004, almost 4 years after Cell development had begun.

I think the 'last minute' rumor for RSX stems from the fact that it was decided upon later than most console parts usually are and it was downgraded big time shortly before launch.

That's exactly what 'last minute' means in regards to RSX.
 
^This. As well as probably reviving your lost pet Fluffy, making your waffles (but no pancakes), and doing your homework for you. Oh, and maybe some Kinect features too...

IMO, when MS reveals this system they'll show off some non-gaming-related feature that will surprise people or blow some minds. It will start a serious debate over whether Orbis is done out of the gate and whether Sony should have done the same, and then the other camp will argue that it wasn't worth the risk and the big performance loss for games.

The debate will end in a stalemate, with some saying, "If gaming is your main focus, get a PS4. If you want to do a ton of cool things as well as some gaming, get a 720."

Then other, even more reasonable people will say it's too early to predict the impact of this and to say which was the better route. Come back in 5 years!

If they could leverage the OS and RAM for features that benefit the social aspect of online gaming (à la something like Party Chat) and expand features to promote playing and interacting with people on your friends list, the visual tradeoff might be worth it... at least for me.
 
IIRC, RSX is actually larger than Xenos' logic. So yeah, Sony didn't purposefully ask for a weak GPU. Nvidia was just a year behind AMD in developing their unified shader tech. The PS3 was originally supposed to launch before it was ready. They did hurt Sony by selling them a design with a bug that screwed up the scaling. AMD hit a home run for MS on the 360. I'm happy Sony went AMD this round. Also, it's a myth that Sony went with Nvidia at the last minute instead of a dual-Cell, sans-GPU design. That was only a very early proposal.

Correct. MS was just lucky that AMD had already done some work on unified shaders with the R300. Xenos was a year ahead of AMD's PC line of cards as well. So I wouldn't blame NVIDIA for how RSX turned out.

I also agree that the NVIDIA deal wasn't last minute.

Of course there is. I haven't said that the Xbox will just run native Windows 8.

But some people are arguing that its OS will have nothing in common with Windows, when it will obviously share the same core kernel. As per Windows Phone 8.

The goal will surely be to allow for easy porting of Windows 8 apps over to Xbox. It would make even more sense if apps ran natively on both, but I don't know how likely that is.

It may be true, but it's largely pointless to single out. Was it worth mentioning what kernel the OS in the OG Xbox or 360 was based on?

Easy porting could very much be relative to how low-level the Durango API is. PC and mobile/tablet games run on a thicker abstraction layer, so porting to and from Durango may not be so simple. Apps are a different story, though, IMO.

Ummmm, you do realize the PS3 was announced in '05, which would coincide with that being last minute. Your "myth" just reinforced the point...

The year a system is announced has little to do with the timing of any contracts made. Not sure how a 2005 announcement indicates the deal was last minute. The hardware in general barely existed when the system was first revealed.
 
We need some kind of chart explaining how it might be easier to empty a regular bathtub with a bucket than a huge jacuzzi with a regular glass. Or something.

Unless the difference in speed is at least a factor of two, it's not going to help in the long run. Microsoft could, in theory, swap out their OS whilst keeping the API mostly the same.

From the "leaked" Durango specs, it's bandwidth is apparently 170 GB/s compared to Orbis' 176 GB/s. 6 GB, while somewhat significant, isn't going to tip the balance towards it. Disk access times are a huge factor in getting any of that data up there. Other than that, there are very few ways to mitigate the halved capacity of memory (like using those extra CUs for procedural content). At the moment, I'm of the opinion that this is major detriment to Orbis.
 
Cell dev-units existed in 2004, as did the RSX-predecessor hardware.

Interesting. I've read that some of the demos (such as FNR3) weren't running on actual hardware and that dev kits were limited. Not sure if it's true, but I just assumed the hardware was very early/non-existent. Thanks. =)
 
Unless the difference in speed is at least a factor of two, it's not going to help in the long run. Microsoft could, in theory, swap out their OS whilst keeping the API mostly the same.

From the "leaked" Durango specs, it's bandwidth is apparently 170 GB/s compared to Orbis' 176 GB/s. 6 GB, while somewhat significant, isn't going to tip the balance towards it. Disk access times are a huge factor in getting any of that data up there. Other than that, there are very few ways to mitigate the halved capacity of memory (like using those extra CUs for procedural content). At the moment, I'm of the opinion that this is major detriment to Orbis.

So you just added 102 to 68 and called it a day.... interesting.

IL+3
 
Correct. MS was just lucky that AMD had already done some work on unified shaders with the R300. Xenos was a year ahead of AMD's PC line of cards as well. So I wouldn't blame NVIDIA for how RSX turned out.

I also agree that the NVIDIA deal wasn't last minute.


Right. Probably not many people know this, but AMD was rumored to have a PC unified-shader GPU developed before Xenos (aka R500): the "R400". I guess the performance wasn't there yet and it didn't go into production. So they made the R420 and R520, which were built from the R300 (9700 Pro) tech. I'm sure that earlier R400 research helped prepare them when they were designing Xenos.
 
I don't think they will be using the entire Pitcairn.

No shit, Sherlock!
But which parts will be removed, and how many transistors do you think 8 Jaguar cores require?

The real measurable performance increase is negligible. XDR is as fast as DDR3-derived RAM, but DDR3 didn't really improve on DDR2's perceived performance.

Ok, take a look at the top XDR DRAM specs on this page
http://www.elpida.com/en/products/xdr.html

and then post a link to some comparable DDR3 specs.

If you can't do that, please write a comprehensive apology to this thread for spreading dumb shit.
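For what it's worth, here's a minimal sketch of the comparison being asked for, using commonly cited per-pin data rates rather than the Elpida datasheet above; treat all the numbers as assumptions:

```python
# Peak bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits/byte.
# Figures below are commonly cited values, not taken from a datasheet.

def bus_bandwidth_gb_s(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8.0

# XDR as configured in the PS3: 400 MHz clock, octal data rate -> 3.2 Gbps/pin.
xdr = bus_bandwidth_gb_s(3.2, 64)        # 25.6 GB/s
# DDR3-1600: 800 MHz clock, double data rate -> 1.6 Gbps/pin.
ddr3 = bus_bandwidth_gb_s(1.6, 64)       # 12.8 GB/s

print(f"XDR (PS3 config): {xdr} GB/s vs DDR3-1600: {ddr3} GB/s")
```

Per pin, XDR's octal data rate does outrun the DDR3 of its era on these assumptions; how that translated into perceived performance is another matter.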
 
Right. Probably not many people know this, but AMD was rumored to have a PC unified-shader GPU developed before Xenos (aka R500): the "R400". I guess the performance wasn't there yet and it didn't go into production. So they made the R420 and R520, which were built from the R300 (9700 Pro) tech. I'm sure that earlier R400 research helped prepare them when they were designing Xenos.

Oops, you're right, Xenos is based on the R400, not the R300. I got that backwards.
 
Funny quotes from Proelite on B3D

Sweetvar26 also said that AMD felt the Xbox was a supercomputer and was "more" powerful. In hindsight, I think he mistakenly switched the platforms.

From the looks of it, Orbis fits the supercomputer descriptor better.

This comment is from the second half of 2012. AMD already knew about the RAM expansion from 2 to 4 GB, as it had already happened.

It's more likely Sweetvar26 got the two consoles mixed up.

Changing his tune yet again... is it sinking in for him that Orbis is more powerful? Or did a source correct him and tip him off?
 
Charlie from SemiAccurate said that developers are unhappy with the underpowered specs of both consoles compared to what they thought they were gonna get. I really think people should stop hoping for some hidden secret that will blow our minds. What you see is what you get. It's just some PC hardware, slightly modified and more efficient, with HSA, which is coming to PCs later on.

At this point, if we get all the rumoured peripherals included in every SKU, then it's not so bad.
 
Charlie from SemiAccurate said that developers are unhappy with the underpowered specs of both consoles compared to what they thought they were gonna get. I really think people should stop hoping for some hidden secret that will blow our minds. What you see is what you get. It's just some PC hardware, slightly modified and more efficient, with HSA, which is coming to PCs later on.

At this point, if we get all the rumoured peripherals included in every SKU, then it's not so bad.

If you look at the current-gen console hardware and how the games look on it, and then look at the specs of the next-gen consoles, it's a pretty nice leap. It doesn't mean a damn thing how it compares to conventional PC hardware.
 
I'm a bit annoyed that so far it looks like neither Microsoft nor Sony have gone with a GCN 2 GPU. I think that's the least they could have done tbh. I mean, I appreciate the 4GB GDDR5 Sony put in, but come on...
 
I'm a bit annoyed that so far it looks like neither Microsoft nor Sony have gone with a GCN 2 GPU. I think that's the least they could have done tbh. I mean, I appreciate the 4GB GDDR5 Sony put in, but come on...

A marginal, insignificant update, and probably not worth the added yield issues, since it's not as established as GCN.
 
Charlie from SemiAccurate said that developers are unhappy with the underpowered specs of both consoles compared to what they thought they were gonna get. I really think people should stop hoping for some hidden secret that will blow our minds. What you see is what you get. It's just some PC hardware, slightly modified and more efficient, with HSA, which is coming to PCs later on.

At this point, if we get all the rumoured peripherals included in every SKU, then it's not so bad.

For the millionth time, no it's not! There's a heavy amount of customization going on here with the Liverpool APU, the GPU, and the architecture, but it even goes way beyond that. Drek explains it well here; other people have gone into even more detail on this subject, but I can't find the posts and don't want to page through 40 pages right now.

You really can't. Orbis is supposedly using a GPU based off of ATi's 7950 laptop series, but the additional tweaking they're giving it, plus the much more streamlined interface it will have with the CPU and system memory, plus the ability for devs to code more closely to the metal, makes comparisons near impossible.

You're going to be able to do things on this hardware that you would never be able to do with a 7850 GPU in a PC. Hell, even a 680 won't be able to achieve what Orbis and Durango will be able to do 4 years down the line, when the devs REALLY start to maximize the hardware.

For Orbis we are looking at a GPU with ~8-10X RSX performance, ~8-10X bandwidth, ~10X PPU performance, and 8X the RAM.

Not to mention the fact that these consoles will be way more efficient at using the power that is 8-10x more on paper. Not to mention additional graphical features like tessellation, etc. In real-world usable difference and end-user experience, it will be much more than a 10x leap!

I expect people to be stunned when they see games, and we can all forget about specs.

Agreed, but I don't know if it'll happen right out of the gate. I think the average person will easily be able to tell the difference and see the jump. It may just take a year or two for our jaws to drop and for us to forget about the specs. I mean, a lot of people already have a certain level they're expecting because they've seen Star Wars 1313 and Watch Dogs.
 
A marginal, insignificant update, and probably not worth the added yield issues, since it's not as established as GCN.

You know this how?

It's like you just described the biggest piece of shit they could've put in there. Like, who came up with this shit "new" GPU? Guys at AMD sat in a room and were like, "How could we make a shit update? We'll ask James." "Well, you make a marginal, insignificant update and bring along a bunch of yield issues, then you slap a 2 at the end and voila."

I mean, seriously. lol. I guess I had some fun with that one.
 
If you look at the current gen console hardware, and how the games look on them, and then look at the specs on the next gen consoles, it's a pretty nice leap. It doesn't mean a damn thing how it compares to conventional PC hardware.

For Orbis we are looking at a GPU with ~8-10X RSX performance, ~8-10X bandwidth, ~10X PPU performance, and 8X the RAM.

It's a true next-gen leap. Durango is slightly more disappointing at face value, but is still a big leap.

It may not seem like much compared to the high-end PCs of today, but it doesn't need to be. Further, back when the 360 launched, GPUs weren't as power-hungry at the high end as they are today... those monstrous power requirements aren't feasible in a closed box.

I expect people to be stunned when they see games, and we can all forget about specs.

Naughty Dog will definitely show off a game trailer, I'm certain, by this year's E3. By that time they'll be 1.5 years into next-gen development... if it weren't for next-gen increasing development time (initially), that would almost be enough time for Naughty Dog to have a game ready for release.
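As a rough sanity check on the "~8-10X" multipliers quoted above, here's a short sketch using commonly cited PS3 figures against the rumored Orbis specs from this thread; every input is an assumption, not a confirmed number:

```python
# Rough ratios behind the "~8-10X" claim. PS3 figures are the commonly
# cited ones; Orbis figures are the rumored specs from this thread.
specs = {
    # metric: (PS3, rumored Orbis)
    "GPU GFLOPS":            (200.0, 1840.0),  # RSX often pegged ~176-230
    "VRAM bandwidth (GB/s)": (22.4, 176.0),    # RSX GDDR3 vs rumored GDDR5
    "RAM (MB)":              (512.0, 4096.0),  # 256 XDR + 256 GDDR3 vs 4 GB
}

for metric, (ps3, orbis) in specs.items():
    print(f"{metric}: {orbis / ps3:.1f}x")
# -> roughly 9x GPU compute, 8x bandwidth, 8x RAM, in line with the claim.
```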
 
I'm a bit annoyed that so far it looks like neither Microsoft nor Sony have gone with a GCN 2 GPU. I think that's the least they could have done tbh. I mean, I appreciate the 4GB GDDR5 Sony put in, but come on...

Just a friendly reminder: GCN 2 has very minor differences from the original GCN. It offers virtually no savings in heat, and the architectural improvements are minimal at best. What GCN 2 will do, however, is offer headaches, because it hasn't hit mass production.
 
You know this how?

It's like you just described the biggest piece of shit they could've put in there. Like, who came up with this shit "new" GPU? Guys at AMD sat in a room and were like, "How could we make a shit update? We'll ask James." "Well, you make a marginal, insignificant update and bring along a bunch of yield issues, then you slap a 2 at the end and voila."

I mean, seriously. lol. I guess I had some fun with that one.

Um, nowhere did I say GCN2 was crap. It's just not a huge improvement, and Sony/Microsoft are pushing the boundaries of SoC design, so they probably want to make sure certain components aren't going to be showstoppers when it comes time to fab.
 
For the millionth time, no it's not! There's a heavy amount of customization going on here with the Liverpool APU, the GPU, and the architecture, but it even goes way beyond that. Drek explains it well here; other people have gone into even more detail on this subject, but I can't find the posts and don't want to page through 40 pages right now.



You're going to be able to do things on this hardware that you would never be able to do with a 7850 GPU in a PC. Hell, even a 680 won't be able to achieve what Orbis and Durango will be able to do 4 years down the line, when the devs REALLY start to maximize the hardware.

They will maximize software down the line, not hardware. They choose what to spend their resources on, what kind of AA they can get away with; lowering resolution and frame rates to push the hardware is not magic, it's just compromise.

Yes, consoles are more efficient because there's no overhead and the hardware is specifically focused on gaming. Features like full HSA, which consoles might get, will come to PC a year later. Other optimizations consoles have, PCs don't need to worry about, because they can brute-force through them. Before, console hardware was quite unique, so it was hard to compare to PCs, but now, with x86 architecture and AMD supplying the CPU and GPU, it's the first time consoles are so similar to PCs. Don't buy into this console magic BS. It's ignorant. It comes down to efficiencies, optimization, and compromises. No magic.
 
Um, nowhere did I say GCN2 was crap. It's just not a huge improvement, and Sony/Microsoft are pushing the boundaries of SoC design, so they probably want to make sure certain components aren't going to be showstoppers when it comes time to fab.

But this is the thing: you and thuway just said that the Sea Islands GPUs (GCN2) are gonna bring very, very little gain and more trouble than they're worth.

Now, if the card is a marginal upgrade, then that means it's made to take advantage of a more mature manufacturing process, which sounds like the opposite of bad yields. A performance gain of 30% would be really big, which is what AMD wants/wanted, but I would guess anywhere up to 20%. One of the things AMD is advancing, though, is HSA and GCN compatibility with it.

So if we add full HSA compatibility and get up to a 20% increase in performance, on a mature 28 nm process that won't bring bad yields...

It doesn't sound like a GeForce Titan, but I don't think it would be an insignificant update for a closed-box environment.

That's my 2 cents.
 
Just a friendly reminder: GCN 2 has very minor differences from the original GCN. It offers virtually no savings in heat, and the architectural improvements are minimal at best. What GCN 2 will do, however, is offer headaches, because it hasn't hit mass production.

There's got to be some difference, or they wouldn't have released it at all, and wouldn't have called it GCN2. What is it supposed to do differently? Any new features? I'm 100% sure it's more efficient to some degree, 15%?
 
Just a friendly reminder: GCN 2 has very minor differences from the original GCN. It offers virtually no savings in heat, and the architectural improvements are minimal at best. What GCN 2 will do, however, is offer headaches, because it hasn't hit mass production.

I read it was a small improvement in performance and better on energy efficiency too? I just think the next-gen consoles' GPUs could do with every ounce of extra power. I get that it's a closed platform aiming for something more affordable, but as time passes I realise it's not the bump in GPU I would have expected. Guess it all depends on price.
 
I read it was a small improvement in performance and better on energy efficiency too? I just think the next-gen consoles' GPUs could do with every ounce of extra power. I get that it's a closed platform aiming for something more affordable, but as time passes I realise it's not the bump in GPU I would have expected. Guess it all depends on price.

Energy efficiency is exactly what consoles need. The evolution of GPUs has required more power to run them, which is a bottleneck for consoles. If GPUs can become more efficient with the same or slightly better performance, then you can actually use high-end GPUs in consoles, depending on how much power they require.

I think both Sony and Microsoft are just aiming for "good enough" this time. If they can get that at the lowest cost, they won't care what's new and exciting.
 
All the big hitters are coming in 2014. A shame the Wii U is already on the market, and the PS4 & X3 are racing each other to get to market this year.
 