Rumor: Xbox 3 = 6-core CPU, 2GB of DDR3 Main RAM, 2 AMD GPUs w/ Unknown VRAM, At CES

gatti-man said:
It means Sony spent a lot of money on a CPU that didn't best suit its gaming needs.
According to who exactly? While it certainly presented development issues, particularly in the beginning since Sony didn't have good tools and libraries available, who's claiming it has no use for gaming?

While it can be argued that utilizing a relatively weak GPU that needs graphics assistance from the CPU is not a great choice, that doesn't really reflect on the usefulness of the CPU itself - that was an architectural decision. Kutaragi's vision, even with the EE in PS2, was for gaming to start doing all sorts of non-graphics simulation work in order to push gaming forward. Did that pan out? Not necessarily, or at least not with 3rd parties (1st party titles have done some interesting work with it), but that's more of a general issue with multi-platform titles.


Let me put it this way. Look at the 'leaks' for Wii U and Xbox '720'. Notice anything regarding the CPUs? They're both going with a lot more cores than the current generation. Obviously they too see a use in gaming for a ton of parallel general-purpose computing power for simulation and other functions. Otherwise why incur the expense?

So if anything, not only was CELL cutting-edge in terms of tech, but also in terms of the direction gaming would eventually move. The industry is quite literally following in the footsteps of Ken's goals. CELL was simply too early, given the cost constraints multi-platform titles impose. Moving forward, we're going to see a serious increase in the sorts of simulation work, etc. done in gaming because all of the consoles are moving to a ton of CPU performance.




Cell was tied to bluray for decoding and was part of Sony's bluray format push; it was never about gaming.
Bullshit.

If Sony only cared about bluray decoding, they would have simply used a traditional CPU in conjunction with dedicated A/V decoders ... the same A/V decoders present in their already released BD players. Relative to developing and fabbing CELL, it would have been far cheaper and would not have risked yield issues.

They absolutely did not need CELL to support BD functionality.

My argument is against all those using the PS3 as a metric for why cutting-edge gaming consoles don't work. My argument is that the PS3 was never cutting edge for gaming.
I disagree.

Should they have gone with a better GPU and memory architecture? Certainly. I think the overall system design was flawed, but CELL in itself was not really the problem (other than budget implications).
 
eastmen said:
The PS2 had 4MB of eDRAM for the GPU. It had more bandwidth than the graphics chip in the PS3 has.

Just for better info, it offered 48GB/s of bandwidth.
In an imaginary scenario, the Xbox 360 hardware would actually make PS2 emulation possible.

I find that strange.

Shouldn't CELL make up for this performance regardless, or was it not capable since Sony's original intention was to leave PS2 hardware on the PS3 for backwards compatibility forever?
 
H_Prestige said:
All those ps2 games on psn right now, and every single one yet to be released, are being emulated.
There are PS2 games on PSN that use general emulation to run?

News to me.






duk said:
one thing's for sure, the 720's architecture will be dev friendly just like the 360's
MS's strength in this regard arguably has less to do with the HW architecture than with tools, libraries, and the development environment.
 
claviertekky said:
In an imaginary scenario, the Xbox 360 hardware would actually make PS2 emulation possible.

I find that strange.
Possibly, though IIRC it isn't just about bandwidth. There were certain custom features in the GPU that simply don't exist on current GPUs ... and would take several more clocks to accomplish. The GS was a very unique design.

Shouldn't CELL make up for this performance regardless, or was it not capable since Sony's original intention was to leave PS2 hardware on the PS3 for backwards compatibility forever?
Using CELL to emulate the GPU isn't tenable.
 
Raistlin said:
According to who exactly? While it certainly presented development issues, particularly in the beginning since Sony didn't have good tools and libraries available, who's claiming it has no use for gaming?

While it can be argued that utilizing a relatively weak GPU that needs graphics assistance from the CPU is not a great choice, that doesn't really reflect on the usefulness of the CPU itself - that was an architectural decision. Kutaragi's vision, even with the EE in PS2, was for gaming to start doing all sorts of non-graphics simulation work in order to push gaming forward. Did that pan out? Not necessarily, or at least not with 3rd parties (1st party titles have done some interesting work with it), but that's more of a general issue with multi-platform titles.


Let me put it this way. Look at the 'leaks' for Wii U and Xbox '720'. Notice anything regarding the CPUs? They're both going with a lot more cores than the current generation. Obviously they too see a use in gaming for a ton of parallel general-purpose computing power for simulation and other functions. Otherwise why incur the expense?

So if anything, not only was CELL cutting-edge in terms of tech, but also in terms of architectural design. The industry is quite literally following in the footsteps of Ken's goals. CELL was simply too early, given the cost constraints multi-platform titles impose. Moving forward, we're going to see a serious increase in the sorts of simulation work, etc. done in gaming because all of the consoles are moving to a ton of CPU performance.





Bullshit.

If Sony only cared about bluray decoding, they would have simply used a traditional CPU in conjunction with dedicated A/V decoders ... the same A/V decoders present in their already released BD players. Relative to developing and fabbing CELL, it would have been far cheaper and would not have risked yield issues.

They absolutely did not need CELL to support BD functionality.


I disagree.

Should they have gone with a better GPU and memory architecture? Certainly. I think the overall system design was flawed, but CELL in itself was not really the problem (other than budget implications).



I can't agree with you here.


There are two ways to make a multi-core CPU.

The first way is to create a single core and replicate it across the CPU.

The second way is to create multiple kinds of cores for different purposes and either replicate them or add them in different amounts. Some ARM chips are going this way, and obviously Cell did.


The main thing with Cell is that it had one control CPU, which is very similar to a single core in the Xbox 360, and then 7 less powerful cores.

The real problem with the PS3 is that it's unbalanced.

You have a really large CPU at 234m transistors vs. Waternoose at 165m transistors.

Then you have a DX8 GPU with the GeForce 7600, while MS was able to go with a DX9 GPU.

Then you have the split memory pool and the really bad bandwidth between Cell and the DDR RAM.


Sony definitely could have built a better system, especially with a $600 budget.
 
Raistlin said:
There are PS2 games on PSN that use general emulation to run?

News to me.


So you missed the PS2 classics stuff?

It does the same thing as running a PS1 game (desyncs the controller, separates you from PSN/XMB, etc.).

So they either cracked it or got some games which don't require that crazy bandwidth to run.


The main thing with Cell is that it had one control CPU, which is very similar to a single core in the Xbox 360, and then 7 less powerful cores.

Except they are not.

The main problem with them is that they are all controlled by that one CPU and only do stuff when asked.
 
I don't think eDRAM is the excuse here if CELL is as powerful as some people here are claiming.

I think Sony realizes more powerful hardware is required for PS2 software emulation to be fully realized. I've seen a PS2 emulator struggling to run on a beefy PC setup.

PS2 software emulation should be fully realized with the PS4.
 
eastmen said:
I can't agree with you here.


There are two ways to make a multi-core CPU.

The first way is to create a single core and replicate it across the CPU.

The second way is to create multiple kinds of cores for different purposes and either replicate them or add them in different amounts. Some ARM chips are going this way, and obviously Cell did.


The main thing with Cell is that it had one control CPU, which is very similar to a single core in the Xbox 360, and then 7 less powerful cores.

The real problem with the PS3 is that it's unbalanced.

You have a really large CPU at 234m transistors vs. Waternoose at 165m transistors.

Then you have a DX8 GPU with the GeForce 7600, while MS was able to go with a DX9 GPU.

Then you have the split memory pool and the really bad bandwidth between Cell and the DDR RAM.


Sony definitely could have built a better system, especially with a $600 budget.
My understanding was costs made it impossible to get the performance they wanted in a symmetrical design. What you are arguing is the specific implementation of CELL ... not the overall intent and goals, which is what I'm talking about. Your post doesn't really disagree with my points.



That said, from what I've been reading the asymmetric design of CELL wasn't necessarily the biggest issue. First off, tools, libraries, etc. were just not there in the beginning. As far as the architecture itself though, one of the main complaints is the lack of a shared pool of memory for the SPEs to communicate through. This is one of the things rectified in the Wii U CPU, and one would assume Xbox's as well.
 
Infinite Justice said:
So you missed the PS2 classics stuff?

It does the same thing as running a PS1 game (desyncs the controller, separates you from PSN/XMB, etc.).

So they either cracked it or got some games which don't require that crazy bandwidth to run.
That's probably it.
 
claviertekky said:
I don't think eDRAM is the excuse here if CELL is as powerful as some people here are claiming.
It's not an "excuse". It's a reason. In engineering we have this concept called reality, and sometimes it imposes certain restrictions. Cell is very good at what it does, but it's not a GPU, so PS2 being hard to emulate in general because of its exotic GPU architecture has no implications on Cell either way.

Infinite Justice said:
The only problem with them is that they are all controlled by that one CPU and only do stuff when asked.
That's not really correct. SPEs can independently issue DMA requests.
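Very rough sketch of the distinction, in plain C++ threads rather than actual Cell SDK code (the Job layout, the worker count, and the shared queue are all made up for illustration): once work items sit in memory, each worker can claim and fetch its own job, the way an SPE can kick off its own DMA transfers (mfc_get/mfc_put in the Cell SDK, if I remember the names right), instead of sitting idle until the control core pushes something at it.

// Conceptual sketch only: "workers pull their own work" vs. "control core pushes every task".
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

struct Job { int first, count; };          // a slice of some big data set

std::vector<Job> jobs;                     // "main memory": the list of work items
std::atomic<size_t> next_job{0};           // workers claim jobs themselves from here

void worker(int id) {
    for (;;) {
        size_t i = next_job.fetch_add(1);  // claim the next job without asking the control core
        if (i >= jobs.size()) break;
        // In Cell terms this is roughly where an SPE would DMA the job's data into
        // its local store, crunch it, then DMA the results back out on its own.
        std::printf("worker %d processed elements %d..%d\n",
                    id, jobs[i].first, jobs[i].first + jobs[i].count - 1);
    }
}

int main() {
    for (int i = 0; i < 16; ++i) jobs.push_back({i * 1024, 1024});
    std::vector<std::thread> pool;
    for (int i = 0; i < 6; ++i) pool.emplace_back(worker, i);  // 6 "SPE-like" workers
    for (auto& t : pool) t.join();
}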
 
Durante said:
It's not an "excuse". It's a reason. In engineering we have this concept called reality, and sometimes it imposes certain restrictions. Cell is very good at what it does, but it's not a GPU, so PS2 being hard to emulate in general because of its exotic GPU architecture has no implications on Cell either way.
Right, I'm saying that 'no eDRAM' can't be the explanation for the people who clamor about CELL's power.
 
Your Excellency said:
Yeh. If the PS4 launched with GTAV with graphics identical to the trailer but with the sole additions of fully destructible environments and cities teeming with traffic/people, it would honestly spark a revolution, and set the bar in ways we never thought possible. Suddenly destructible environments would take precedence over 1080p and other developers would focus on gameplay enhancements (AI, destructibility, huge crowds) over graphics enhancements (1080p, 16xAA).

You don't know how awesome it is for me that you posted this. So often during "next gen" or even current gen game discussions all the talk is nothing but super sweet grafix, while I'm sitting here thinking "Yeah that's nice but what about the gameplay?" So often I feel like an outsider.

I really hope that a developer will buck the trend of focusing on the shiny things and put most of the resources, within reason, into all the enhancements you and I have previously mentioned. Like you, I truly believe that if a (talented) dev team were to focus on those elements, a revolution would be set ablaze in the gaming world equal to the one R* created with GTAIII & the sandbox genre.

diffusionx said:
The problem with this isn't hardware but the fact that people actually have to sit there and make all those crappy buildings that 95% of people won't bother with.

Not necessarily. I don't see any reason why a set of core interior/exterior building/housing/whatever structures couldn't be built by the designers, then generated and pieced together Lego-style by the game itself (see the rough sketch below). When it comes down to it there aren't that many differences between housing layouts, and even fewer between apartments/condos.

Just imagine in GTA VI that your character is a corrupt Mayor/Senator by day, and the ultimo badass by night. You completely level the downtown core, then set in motion the rebuilding of that same area, doing much of the building design yourself using in-game creation tools. Obviously a stretch, and possibly not the best example, but the gist is in there.
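A rough sketch of the "designers author prefabs, the game does the assembly" idea, in C++; the module names, widths, and the very naive layout rule are made up purely for illustration:

// Naive illustration: designers author a small set of room modules once,
// and the game stitches whole building interiors together from them at runtime.
#include <cstdio>
#include <random>
#include <string>
#include <vector>

struct RoomModule { std::string name; int width; };   // hypothetical designer-built prefab

int main() {
    std::vector<RoomModule> prefabs = {
        {"lobby", 8}, {"hallway", 4}, {"apartment_a", 6}, {"apartment_b", 6}, {"stairwell", 3}
    };

    std::mt19937 rng(12345);                           // seed per building so a given address always looks the same
    std::uniform_int_distribution<size_t> pick(1, prefabs.size() - 1);

    // Very naive layout rule: the ground floor starts with a lobby, upper floors
    // with a stairwell, then the remaining width is filled with random prefabs.
    const int floors = 3, floor_width = 24;
    for (int f = 0; f < floors; ++f) {
        int used = 0;
        std::printf("floor %d: %s", f, (f == 0 ? "lobby" : "stairwell"));
        used += (f == 0 ? prefabs[0].width : prefabs[4].width);
        while (used < floor_width) {
            const RoomModule& m = prefabs[pick(rng)];
            if (used + m.width > floor_width) break;
            std::printf(" + %s", m.name.c_str());
            used += m.width;
        }
        std::printf("\n");
    }
}

The point is just that a handful of hand-authored modules can be stamped out into endless interiors, so the designers aren't modelling every crappy building nobody will enter.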
 
Shin Johnpv said:
This makes NO SENSE, since they shared a large number of the same codecs, plus early blurays were still using Mpeg2 for their encode.
http://www.anandtech.com/show/2136/3

"We could have used a less powerful CPU, as our HD-DVD movies proved to be less stressful on our hardware than the Blu-ray movies we've tested, but we stuck with the X6800 for consistency with our previous article."
 
Infinite Justice said:
So you missed the PS2 classics stuff?

It does the same thing as running a PS1 game (desyncs the controller, separates you from PSN/XMB, etc.).

So they either cracked it or got some games which don't require that crazy bandwidth to run.
Heh ... actually yeah, I did miss it.

My assumption is that either they updated the games to use some native API calls in problem areas ... and/or it's as you said ... they chose games that don't require the crazy bandwidth to run.

It's not like they haven't been trying to get emulation to work. They've been trying since before launch. Obviously they couldn't get it working with enough games, otherwise they wouldn't have included varying levels of PS2 hardware along the way.





claviertekky said:
I don't think eDRAM is the excuse here if CELL is as powerful as some people here are claiming.
What are you talking about? The problem with PS2 emulation was always because of the GS, not the EE. CELL was used to emulate the EE on all second generation PS3s (1st gen in Europe IIRC).

CELL isn't some magical processor that can emulate a CPU and GPU ... particularly when said GPU did a lot of really custom stuff with dedicated silicon.
 
Raistlin said:
My understanding was costs made it impossible to get the performance they wanted in a symmetrical design. What you are arguing is the specific implementation of CELL ... not the overall intent and goals, which is what I'm talking about. Your post doesn't really disagree with my points.



That said, from what I've been reading the asymmetric design of CELL wasn't necessarily the biggest issue. First off, tools, libraries, etc. were just not there in the beginning. As far as the architecture itself though, one of the main complaints is the lack of a shared pool of memory for the SPEs to communicate through. This is one of the things rectified in the Wii U CPU, and one would assume Xbox's as well.

What I'm saying is that if they went with a symmetrical design like MS did, they'd lose CPU power, but they would have been able to devote the silicon costs to another part of the system, like the GPU or even more RAM.

Think about it

2005: $300/$400, 3-core CPU, DX9 GPU, 512 megs of RAM
2006: $500/$600, 7-core CPU, DX8 GPU, 512 megs of RAM

Just think if they went with something smaller silicon-wise for the GPU. They could have easily added more RAM or reduced the cost of the system.

The Cell had a lot of problems, and tools were one of them. One would think that with an easier symmetrical CPU they would have been able to create better tools faster.


Modern CPUs have a large L3 cache for the cores to share. Even the Xbox 360's CPU had 1MB of cache for the 3 cores to share, and even the GPU can look at that cache.

When you look at the design of the Xbox 360 you can see MS had an idea of what to do and made sure that the CPU and GPU played off each other in the best ways possible.

When you look at the PS3 you can see it's just a mess and Sony had no idea what they were doing.

MS has their hands in DirectX, so they know the trends and what's coming years out, and they know how to implement it the best way. It's an advantage they won't lose this gen.
 
Raistlin said:
What are you talking about? The problem with PS2 emulation was always because of the GS, not the EE. CELL was used to emulate the EE on all second generation PS3s (1st gen in Europe IIRC).

CELL isn't some magical processor that can emulate a CPU and GPU ... particularly when said GPU did a lot of really custom stuff with dedicated silicon.
That's what I feel some people are saying: that CELL is almighty powerful and is that magical processor.

I said earlier that in an imaginary scenario, the X360 hardware would be better suited to emulating PS2 games.
 
gatti-man said:
http://www.anandtech.com/show/2136/3

"We could have used a less powerful CPU, as our HD-DVD movies proved to be less stressful on our hardware than the Blu-ray movies we've tested, but we stuck with the X6800 for consistency with our previous article."
You are using out-of-context quotes to fight a battle against specs. That is rather ridiculous. HD-DVD and Blu-ray specs both support the same codec (H264 AVC), so any device that fulfills the spec will have to be able to decode that. What movies Derek Wilson had at hand when he wrote that article has no bearing on the discussion.
 
Raistlin said:
According to who exactly? While it certainly presented development issues, particularly in the beginning since Sony didn't have good tools and libraries available, who's claiming it has no use for gaming?

While it can be argued that utilizing a relatively weak GPU that needs graphics assistance from the CPU is not a great choice, that doesn't really reflect on the usefulness of the CPU itself - that was an architectural decision. Kutaragi's vision, even with the EE in PS2, was for gaming to start doing all sorts of non-graphics simulation work in order to push gaming forward. Did that pan out? Not necessarily, or at least not with 3rd parties (1st party titles have done some interesting work with it), but that's more of a general issue with multi-platform titles.


Let me put it this way. Look at the 'leaks' for Wii U and Xbox '720'. Notice anything regarding the CPUs? They're both going with a lot more cores than the current generation. Obviously they too see a use in gaming for a ton of parallel general-purpose computing power for simulation and other functions. Otherwise why incur the expense?

So if anything, not only was CELL cutting-edge in terms of tech, but also in terms of the direction gaming would eventually move. The industry is quite literally following in the footsteps of Ken's goals. CELL was simply too early, given the cost constraints multi-platform titles impose. Moving forward, we're going to see a serious increase in the sorts of simulation work, etc. done in gaming because all of the consoles are moving to a ton of CPU performance.





Bullshit.

If Sony only cared about bluray decoding, they would have simply used a traditional CPU in conjunction with dedicated A/V decoders ... the same A/V decoders present in their already released BD players. Relative to developing and fabbing CELL, it would have been far cheaper and would not have risked yield issues.

They absolutely did not need CELL to support BD functionality.


I disagree.

Should they have gone with a better GPU and memory architecture? Certainly. I think the overall system design was flawed, but CELL in itself was not really the problem (other than budget implications).

Cell can be used for gaming; so can ARM, so can a Pentium 4, so can any processor. That doesn't mean it was designed and optimized for it.

Sony didn't ONLY care about optimizing for HT, but that was on the table. They had a bunch of agendas for the PS3 and the gaming side suffered for it. No, the same A/V decoders were not present at the time of the PS3 launch. They weren't present for months after either. I'm staggered at the willingness of so many people to make claims without sources. A simple Google search would prove you wrong. PS3 set the bluray standard because of Cell.


Durante said:
You are using out-of-context quotes to fight a battle against specs. That is rather ridiculous. HD-DVD and Blu-ray specs both support the same codec (H264 AVC), so any device that fulfills the spec will have to be able to decode that. What movies Derek Wilson had at hand when he wrote that article has no bearing on the discussion.
What was the most commonly used HD-DVD codec? VC-1. VC-1 is far easier to decode than the h.264 codec. It's easier on hardware. How is that out of context? Not to mention Java, which literally killed all first-gen players besides the PS3.
 
Durante said:
You are using out-of-context quotes to fight a battle against specs. That is rather ridiculous. HD-DVD and Blu-ray specs both support the same codec (H264 AVC), so any device that fulfills the spec will have to be able to decode that. What movies Derek Wilson had at hand when he wrote that article has no bearing on the discussion.

Blu-ray has Java in its specs and HD-DVD used a Microsoft language (I forget which).

Java was much more demanding in the spec. It's why, 5 years later, any Java-enabled bluray discs take forever to load.
 
gatti-man said:
http://www.anandtech.com/show/2136/3

"We could have used a less powerful CPU, as our HD-DVD movies proved to be less stressful on our hardware than the Blu-ray movies we've tested, but we stuck with the X6800 for consistency with our previous article."

Could it be that the HD DVD movies he tested were of lower bitrate than the bluray movies he tested?


eastmen said:
Blu-ray has Java in its specs and HD-DVD used a Microsoft language (I forget which).

Java was much more demanding in the spec. It's why, 5 years later, any Java-enabled bluray discs take forever to load.

Java is for the bluray menu. Once you start playing the movie, it has absolutely no bearing on decoding. Heck, you could use a player like MPC-HC and directly play the movie by picking the right file.
 
eastmen said:
Blu-ray has Java in its specs and HD-DVD used a Microsoft language (I forget which).

Java was much more demanding in the spec. It's why, 5 years later, any Java-enabled bluray discs take forever to load.
Uh...what?

Java is inefficient. People use Java since it's easily compatible on multiple systems.
 
tycoonheart said:
Could it be that the HD DVD movies he tested were of lower bitrate than the bluray movies he tested with?
HDDVDs were all lower bitrate than blurays if you took an average. That's another good reason why BD was tougher on hardware, unless they were mirror transfers. Man, I can remember all the raging arguments on AVS about bitrates and how HDDVD was going to ruin PQ because of it.
 
duk said:
one thing's for sure, the 720's architecture will be dev friendly just like the 360's

Developer friendly but extremely proprietary.

I won't be surprised to see MS and AMD work on a radical GPU that requires much more advanced graphical features than what's available in DX11 / the latest OpenGL.

This can benefit MS in several ways.

1. It prevents developers from easily porting from Wii-U to 720, or doing cross-platform development on the PS4 and 720. Studios with limited budgets will have to choose between the platforms, and MS will do everything to ensure that studios choose the 720. This will result in "free" exclusives, at least in the early cycle of the generation.

2. It'll allow MS to dictate where the graphics industry is going, especially on the PC front. They did this with Xenos, and I am 100% sure they will do it again with the next XGPU.

3. Because of point two, cross-platform development between PC and 720 should be trivial.

If MS can release a console that's radically different in development process than Wii-U and PS4, but extremely powerful, extremely easy to develop for, and has great synergy with PC, they basically can hold the industry by its balls.

Sony can counter with Vita / PS4, but I am not sure where that leaves Nintendo.
 
gatti-man said:
What was the most commonly used HD-DVD codec? VC-1. VC-1 is far easier to decode than the h.264 codec. It's easier on hardware. How is that out of context? Not to mention Java, which literally killed all first-gen players besides the PS3.
"On average", "commonly", and so on do not matter in the least. What matters is what the spec requires the hardware to be capable of. Any HD-DVD player sold needs to be able to decode H264 AVC, regardless of how common that occurrence is.
 
gatti-man said:
HDDVDs were all lower bitrate than blurays if you took an average. That's another good reason why BD was tougher on hardware, unless they were mirror transfers. Man, I can remember all the raging arguments on AVS about bitrates and how HDDVD was going to ruin PQ because of it.

HDDVD having a lower bitrate had nothing to do with HDDVD itself. It was the publishers who were releasing lower bitrate movies. According to the spec, wasn't the max bitrate something like 40 for HDDVD?
 
eastmen said:
What I'm saying is that if they went with a symmetrical design like MS did, they'd lose CPU power, but they would have been able to devote the silicon costs to another part of the system, like the GPU or even more RAM.

Think about it

2005: $300/$400, 3-core CPU, DX9 GPU, 512 megs of RAM
2006: $500/$600, 7-core CPU, DX8 GPU, 512 megs of RAM

Just think if they went with something smaller silicon-wise for the GPU. They could have easily added more RAM or reduced the cost of the system.
What's there to think about? I never stated this wasn't the case. They made a decision regarding what sort of CPU performance they wanted.

Was it too early, particularly since 3rd parties wouldn't use it on multi-platform titles (or arguably any, since 3rd party exclusives have pretty much gone away)? Yes, I think it was. Ken kind of did the same thing with EE ... CELL was just taking it way further.

I really haven't been arguing for or against their decision in this. I was simply pointing out:
  • that from a technical standpoint, CELL was obviously advanced
  • that the design goals, while probably too early, are the direction everyone is going in


The Cell had a lot of problems, and tools were one of them. One would think that with an easier symmetrical CPU they would have been able to create better tools faster.
I'm not sure that's the case.

Modern CPUs have a large L3 cache for the cores to share. Even the Xbox 360's CPU had 1MB of cache for the 3 cores to share, and even the GPU can look at that cache.
Who's disagreeing? I pointed this out.

When you look at the design of the Xbox 360 you can see MS had an idea of what to do and made sure that the CPU and GPU played off each other in the best ways possible.

When you look at the PS3 you can see it's just a mess and Sony had no idea what they were doing.
That's a bit of hyperbole.

As an aside, it's not like MS designed Xenon. IBM actually used what they learned working on CELL for it.

MS has their hands in DirectX, so they know the trends and what's coming years out, and they know how to implement it the best way. It's an advantage they won't lose this gen.
Why is this directed at me exactly?
 
Proelite said:
Developer friendly but extremely proprietary.

I won't be surprised to see MS and AMD work on a radical GPU that requires much more advanced graphical features than what's available in DX11 / the latest OpenGL.

This can benefit MS in several ways.

1. It prevents developers from easily porting from Wii-U to 720, or doing cross-platform development on the PS4 and 720. Studios with limited budgets will have to choose between the platforms, and MS will do everything to ensure that studios choose the 720. This will result in "free" exclusives, at least in the early cycle of the generation.

2. It'll allow MS to dictate where the graphics industry is going, especially on the PC front. They did this with Xenos, and I am 100% sure they will do it again with the next XGPU.

3. Because of point two, cross-platform development between PC and 720 should be trivial.

If MS can release a console that's radically different in development process than Wii-U and PS4, but extremely powerful, extremely easy to develop for, and has great synergy with PC, they basically can hold the industry by its balls.
These days, I don't see how that can happen. The part of a graphics engine that interacts with the API layer is relatively small, and usually designed to be interchangeable. And existing APIs allow you to do most everything you could want to with the hardware resources on the GPU.

I really can't think of any "much more advanced graphical features" that would be important enough for developers to limit themselves to one (and a half) platform(s) in such a way.
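Toy sketch of what I mean by that API layer, in plain C++ (IRenderer and the backend names are made up for illustration): the bulk of the engine only ever talks to an abstract interface, and porting to another API means writing the thin backend underneath it, not touching the engine.

// Toy illustration of an engine's thin, swappable graphics-API layer.
// Engine code only ever sees IRenderer; per-API backends live behind it.
#include <cstdio>
#include <memory>

struct IRenderer {
    virtual ~IRenderer() = default;
    virtual void drawMesh(int meshId) = 0;          // engine-facing calls stay API-agnostic
    virtual void present() = 0;
};

struct D3DRenderer : IRenderer {                    // hypothetical per-API backend
    void drawMesh(int meshId) override { std::printf("D3D draw mesh %d\n", meshId); }
    void present() override { std::printf("D3D present\n"); }
};

struct GLRenderer : IRenderer {                     // another backend, same interface
    void drawMesh(int meshId) override { std::printf("GL draw mesh %d\n", meshId); }
    void present() override { std::printf("GL present\n"); }
};

void renderFrame(IRenderer& r) {                    // the vast majority of the engine looks like this
    r.drawMesh(1);
    r.drawMesh(2);
    r.present();
}

int main() {
    std::unique_ptr<IRenderer> r = std::make_unique<D3DRenderer>();  // picked per platform
    renderFrame(*r);
}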
 
gatti-man said:
Cell can be used for gaming; so can ARM, so can a Pentium 4, so can any processor. That doesn't mean it was designed and optimized for it.
And ARM or pentium 4 were?

Oh I forgot, there were tons of processors available that offered the parallel performance of CELL when PS3 was being created. Moreover, they were designed and optimized for gaming.

Sony didn't ONLY care about optimizing for HT, but that was on the table. They had a bunch of agendas for the PS3 and the gaming side suffered for it. No, the same A/V decoders were not present at the time of the PS3 launch. They weren't present for months after either. I'm staggered at the willingness of so many people to make claims without sources. A simple Google search would prove you wrong. PS3 set the bluray standard because of Cell.
Go find me a source that shows the PS3 was the first BD player released.

Guess what, you can't, because it wasn't. Explain how the previously released BD players functioned if they couldn't decode the audio and video? lol
 
Durante said:
These days, I don't see how that can happen. The part of a graphics engine that interacts with the API layer is relatively small, and usually designed to be interchangeable. And existing APIs allow you to do most everything you could want to with the hardware resources on the GPU.

I really can't think of any "much more advanced graphical features" that would be important enough for developers to limit themselves to one (and a half) platform(s) in such a way.

I agree, but I can see MS doing everything they can to make cross platform development between next gen consoles as difficult as possible.
 
gatti-man said:
HDDVDs were all lower bitrate than blurays if you took an average. That's another good reason why BD was tougher on hardware, unless they were mirror transfers. Man, I can remember all the raging arguments on AVS about bitrates and how HDDVD was going to ruin PQ because of it.

You should stop, this is getting embarrassing. You're conflating wrapper formats with disc formats with codecs, and much else besides.
 
Proelite said:
Developer friendly but extremely proprietary.

I won't be surprised to see MS and AMD work on a radical GPU that requires much more advanced graphical features than what's available in DX11 / the latest OpenGL.

This can benefit MS in several ways.

1. It prevents developers from easily porting from Wii-U to 720, or doing cross-platform development on the PS4 and 720. Studios with limited budgets will have to choose between the platforms, and MS will do everything to ensure that studios choose the 720. This will result in "free" exclusives, at least in the early cycle of the generation.

2. It'll allow MS to dictate where the graphics industry is going, especially on the PC front. They did this with Xenos, and I am 100% sure they will do it again with the next XGPU.

3. Because of point two, cross-platform development between PC and 720 should be trivial.

If MS can release a console that's radically different in development process than Wii-U and PS4, but extremely powerful, extremely easy to develop for, and has great synergy with PC, they basically can hold the industry by its balls.

I disagree that MS or anyone would try to odd-man-out themselves in terms of development process or required tech, but if any would or could, they'd actually be putting themselves in a dangerous position.

That's not going to happen. Differentiation will be possible between platforms, but IMO next-gen (as this gen between PS3 and 360) it'll again be relatively superficial (i.e. maybe better textures/res/etc.).

No one's going to build a platform that carries fundamental tech lock-in, even if that were possible. It'd be extremely risky and would probably lead to that platform being ignored in the face of two (viable) platforms that can share investment, rather than naturally attracting default exclusives. Better to add 'icing' features that devs can exploit easily in a multiplat context.
 
You can't say that Sony could have used a better GPU if they'd used a smaller CPU. If it was a single chip with a maximum transistor count, sure. But it isn't.

I thought the rumours were that Sony wanted another CELL as the GPU but it didn't pan out, so they had to source a standard GPU quickly. That short-term need might have more to do with the quality of the GPU than CELL does.
 
Raistlin said:
What's there to think about? I never stated this wasn't the case. They made a decision regarding what sort of CPU performance they wanted.

Was it too early, particularly since 3rd parties wouldn't use it on multi-platform titles (or arguably any, since 3rd party exclusives have pretty much gone away)? Yes, I think it was. Ken kind of did the same thing with EE ... CELL was just taking it way further.

Third-party exclusives went away because the PS3 failed. If the PS3 had been the PS2 out of the gate, there would still be exclusives.

The problem was that Sony was stuck in their ways; they felt they could release a crappy system again, get tons of sales, and force developers to dig the power out of their system. It worked for them with the PSX and then again with the PS2. This time MS was able to come out first.

The design of the PS3 was a mistake from the start.

I really haven't been arguing for or against their decision in this. I was simply pointing out:
  • that from a technical standpoint, CELL was obviously advanced
  • that the design goals, while probably too early, are the direction everyone is going in



I'm not sure that's the case.


Who's disagreeing? I pointed this out.


That's a bit of hyperbole.

As an aside, it's not like MS designed Xenon. IBM actually used what they learned working on CELL for it.


Why is this directed at me exactly?

MS had a lot of input on both Waternoose and Xenon and added a lot to Waternoose. IBM was working on the CPU design before Sony ever entered the conversation. They used the same design and modified it to make Cell and modified it to make Waternoose.
 
tycoonheart said:
I meant bundled in Kinect. No chance they make the kinect an accessory given the way MS has been pushing it recently.

I mean, looking at the technology listed in this rumor, what it suggests the next xbox might have seems to be a step over the tech that existed when the xbox 360 was released.
Every chance they make Kinect an accessory. On 360 it's a mid-life refresh to expand to casual markets as the price of the console drops to mass-market prices.

With the 720 at launch, it'll be expensive and aimed at core gamers. Save Kinect 2 for a few years down the line, or just keep it compatible with Kinect 1.
 
gatti-man said:
What was the most commonly used HD-DVD codec? VC-1. VC-1 is far easier to decode than the h.264 codec. It's easier on hardware. How is that out of context? Not to mention Java, which literally killed all first-gen players besides the PS3.

As has already been said, what was common, what was most used, doesn't mean dog shit. If the spec says it decodes H264 then any player released has to support that, and both mediums, spec-wise, supported almost exactly the same formats, including the supposedly super-hard H264.

Plus don't even bring in this most commonly used thing, when at the start of the formats the most commonly used codec on BD was mpeg2.
 
Proelite said:
I agree, but I can see MS doing everything they can to make cross platform development between next gen consoles as difficult as possible.

This is how I know you're dreaming. That would be the most horrific idea possible, considering the other 2 next gen systems and PC will be sharing development pipelines.

We had a Crytek rep today at school and he kinda hinted they are looking into next xbox tech stuff.

The kits are indeed out there in their initial rudimentary form.

Every chance they make Kinect an accessory. On 360 it's a mid-life refresh to expand to casual markets as the price of the console drops to mass-market prices.

They wouldn't be pushing their entire internal (remaining) first parties to include Kinect functionality if they weren't planning on putting a Kinect 2.0 inside the box. As an accessory, you can never truly be successful.

MS had a lot of input on both Waternoose and Xenon and added a lot to Waternoose. IBM was working on the CPU design before Sony ever entered the conversation. They used the same design and modified it to make Cell and modified it to make Waternoose.

The PPE in the Cell and a single core from Xenon are for all intents and purposes identical. This "6 core" rumour could seriously suggest 2 x Xenon (maybe overclocked). The process is cheap and mature now. That would put a fork into the AMD x86 CPU/APU rumours that have permeated over the last year, however.
 
Proelite said:
Developer friendly but extremely proprietary.

I won't be surprised to see MS and AMD work on a radical GPU that requires much more advanced graphical features than what's available in DX11 / the latest OpenGL.

This can benefit MS in several ways.

1. It prevents developers from easily porting from Wii-U to 720, or doing cross-platform development on the PS4 and 720. Studios with limited budgets will have to choose between the platforms, and MS will do everything to ensure that studios choose the 720. This will result in "free" exclusives, at least in the early cycle of the generation.

2. It'll allow MS to dictate where the graphics industry is going, especially on the PC front. They did this with Xenos, and I am 100% sure they will do it again with the next XGPU.

3. Because of point two, cross-platform development between PC and 720 should be trivial.

If MS can release a console that's radically different in development process than Wii-U and PS4, but extremely powerful, extremely easy to develop for, and has great synergy with PC, they basically can hold the industry by its balls.

Sony can counter with Vita / PS4, but I am not sure where that leaves Nintendo.
[image: XjldD.gif]
 
Raistlin said:
And ARM or pentium 4 were?

Oh I forgot, there were tons of processors available that offered the parallel performance of CELL when PS3 was being created. Moreover, they were designed and optimized for gaming.


Go find me a source that shows the PS3 was the first BD player released.

Guess what, you can't, because it wasn't. Explain how the previously released BD players functioned if they couldn't decode the audio and video? lol
Because they couldn't output the high-res audio codecs and downgraded them to Dolby Digital. In most first-gen movies you had to actually choose lossless audio because launch players couldn't decode TrueHD et al. Why are you throwing around attitude when you have zero clue what you are talking about? I never said the PS3 was the first bluray player, btw. However, it was the best for years.
 
eastmen said:
MS had a lot of input on both Waternoose and Xenon and added a lot to Waternoose.
How would Microsoft have added anything to a processor architecture at a time when they couldn't even figure out adequate heatsink sizes?
 
BurntPork said:
http://i.imgur.com/XjldD.gif

There is precedent for their stupidity / evilness. I don't think they care about placing themselves into dangerous situations in terms of developer support, because they pretty much risk that with everything they do.
 
Shin Johnpv said:
As has already been said, what was common, what was most used, doesn't mean dog shit. If the spec says it decodes H264 then any player released has to support that, and both mediums, spec-wise, supported almost exactly the same formats, including the supposedly super-hard H264.

Plus don't even bring in this most commonly used thing, when at the start of the formats the most commonly used codec on BD was mpeg2.
Links please. Mpeg2 was just Sony releases, and Mpeg2 was panned as a garbage codec immediately. Bitrate on bluray was far higher, FYI.
 
higherARC05 said:
Hey, is it also possible to add eDRAM to the CPU cores?

Beast of a CPU.

Why not? I believe the Wii U has eDRAM on the CPU.
The question is would devs want that, how will it affect the budget, and is it even smart to do it.
 