
360 GPU exposed, 4xMSAA etc., and the PS3 GPU was a late change.

Razoric said:
It's really hard to talk about PS3 HW vs Xbox 360 HW when you have so many biased people coming off the ATI vs Nvidia wars. So much misinformation. :\
Yeah. Combining MS vs Sony with ATI vs Nvidia is a recipe for a fanboy shitstorm of truly EPIC proportions.

If nothing else, it should be entertaining to watch. :D
 
Razoric said:
It's really hard to talk about PS3 HW vs Xbox 360 HW when you have so many biased people coming off the ATI vs Nvidia wars. So much misinformation. :\

Eeeh, you really hit the point, sadly.
With ATI and nVidia GPUs being in future consoles, it won't only be a fight between MS and Sony fanboys. Everyone, even people with no interest in console gaming, like ATI and nVidia fanboys or biased PC hardware websites, will want to tell their version of the story.
 
Lazy8s said:
Pimpwerx:

nVidia has already detailed some of the major changes they made that account for differences from past GeForces while revealing similarities that show it's scaled from past PC chips in significant ways.

Not at all. The graphics processor was largely designed by nVidia and based off of their architecture, and Sony is contributing mostly on the implementation and integration sides.

The purpose of those approaches is to move deferred rendering for visible surface determination and scene division for data size manageability closer to the device and further from the game software, getting some of the benefits PowerVR enjoys like fast stencil for shadows, fast AA, and fast Z check.

Embedded RAM set-ups like that have some philosophical similarities to TBDR.

It includes it, but such a conventional architecture was definitely not built from the ground up for it. The memory requirements are better suited to a processor with low bandwidth requirements.

1. I'll say it again, we do not know what the internal makeup of the RSX is. We don't even have a block diagram, just that one slide they showed, which was very high-level, and more of a flow-chart. We don't even know how many pipes it has.

2. Phil Harrison already said RSX is being made by Sony, and for what it's worth, that means they can add whatever else they'd like. It could just be a memory controller, or it could be the chaperone. As he said, NVidia designed part of the chip, not the full thing. Again, we must wait and see. Hopefully Pana knows more that can be revealed.

3. Yup, frees up bandwidth and lets those shader cores work their magic without clogging the external bus(es).

4. Please don't break out the TBDR stuff on me again. PowerVR tech is overrated, and until they release a bleeding edge card that beats the BEST cards on the market, I will hate on them forever. Making the best midrange card means fuck all IMO.

5. Pfft, you are basing this on what? And you call it "conventional" before knowing what's inside the chip. RSX is NOT a G70. It's an adaptation. How much had to be modified is as yet unknown. We have to wait and see what's inside and what else accounts for those 300M trannies before making comments like this. It's as bad as calling the Xenos an R500, when it's clearly such a different product. PEACE.
 
Razoric said:
It's really hard to talk about PS3 HW vs Xbox 360 HW when you have so many biased people coming off the ATI vs Nvidia wars. So much misinformation. :\

:lol

so true. How I long for the days when these debates were carried out over at B3D! Now it spills over onto GAF!

On a side note, I can't wait until I read a glowing review of the RSX on THG /rimshot
 
Edge went to press before the E3 conferences, but was published on 18th May.

Here is what the Xbox360 specs were in Edge before the willy waving started.

CPU Game Math Performance:
9 billion dot product ops / sec
No change

Polygon Performance:
500 million triangles / sec
Xbox 1.5 indeed :lol when you consider the original's 150m / sec claim

Pixel Fill Rate:
16 gigasamples / sec
No change

Shader Performance:
48 billion shader ops / sec
Seems to have increased in a very short space of time?

HD Support:
720p, 1080i
1080p seems to have been added
 
Razoric said:
It's really hard to talk about PS3 HW vs Xbox 360 HW when you have so many biased people coming off the ATI vs Nvidia wars. So much misinformation. :\

The worlds of PC and console are merging creating new fanboy wars. I'm pretty used to the ATI/Nvidia bickering by now. :lol If you guys think GAF is bad at times you should see some of the battles on video card technology on PC boards. We are starting to get a small taste! :D
 
Nick Laslett said:
Edge went to press before the E3 conferences, but was published on 18th May.

Here is what the Xbox360 specs were in Edge before the willy waving started.

CPU Game Math Performance:
9 billion dot product ops / sec
No change

Polygon Performance:
500 million triangles / sec
Xbox 1.5 indeed :lol when you consider the original's 150m / sec claim

Pixel Fill Rate:
16 gigasamples / sec
No change

Shader Performance:
48 billion shader ops / sec
Seems to have increased in a very short space of time?

HD Support:
720p, 1080i
1080p seems to have been added



Geforce 6800 Ultra:
53 ops per clock * 400MHz = 21.2 billion ops per second

Geforce 6800 Ultra Extreme Edition
53 ops per clock * 450MHz = 23.9 billion ops per second

PS3 GPU (RSX)
136 ops per clock * 550MHz = 74.8 billion ops per second

Xbox 360 GPU (R500)
192 ops per clock * 500MHz = 96 billion ops per second
 
Ghost of Bill Gates said:


Xbox 360 GPU (R500)
192 ops per clock * 500MHz = 96 billion ops per second


It is a shame they didn't tell the Edge magazine journalists this before they went to press.

Because it says only 48 billion shader ops / sec in the Specs printed in the article.

I wonder how in the space of a few days it can have increased so much?

I wonder if Edge will have a correction when they print the PS3 specs next month?
 
Ghost of Bill Gates said:
Xbox 360 GPU (R500)
192 ops per clock * 500MHz = 96 billion ops per second

Comedy gold. It's actually 96 ops per clock * 500MHz = 48 billion ops per second.
 
Izzy said:
Comedy gold. It's actually 96 ops per clock * 500MHz = 48 billion ops per second.

Hey! GoBG read it on the Intarweb! It's gotta be true! ;)
 
Nick Laslett said:
It is a shame they didn't tell the Edge magazine journalists this before they went to press.

Because it says only 48 billion shader ops / sec in the Specs printed in the article.

I wonder how in the space of a few days it can have increased so much?

I wonder if Edge will have a correction when they print the PS3 specs next month?

huh? Edge... what's that?
 
Izzy said:
Comedy gold. It's actually 96 ops per clock * 500MHz = 48 billion ops per second.

actually,

It's faster than Microsoft has it listed. The eDRAM is like Super Memory that can handle certain GPU tasks. The GPU is codenamed XENOS. It also functions as the memory controller. 720p and 1080p with 4x AA is free. There is no performance hit. It has 48 agnostic pipelines and can do 98 billion ALU ops per second.

http://www.techreport.com/etc/2005q2/xbox360-gpu/index.x?pg=2
 
NVIDIA confirmed that the RSX features full FP32 support, like the current generation GeForce 6 as well as ATI's Xbox 360 GPU. NVIDIA did announce that the RSX would be able to execute 136 shader operations per cycle, a number that is greater than ATI's announced 96 shader ops per cycle.

Link
 
Ghost of Bill Gates said:
Geforce 6800 Ultra:
53 ops per clock * 400MHz = 21.2 billion ops per second

Geforce 6800 Ultra Extreme Edition
53 ops per clock * 450MHz = 23.9 billion ops per second

PS3 GPU (RSX)
136 ops per clock * 550MHz = 74.8 billion ops per second

Xbox 360 GPU (R500)
192 ops per clock * 500MHz = 96 billion ops per second

To elaborate further on what others are suggesting:

You're comparing an incorrect floating point performance figure with shader op figures.

The figure for floating point ops for Xenos is 480 per cycle (or should be).

That comes out of 48 vector ops per cycle + 48 scalar ops per cycle (we're now talking in terms of shader ops).

1 vector op = 8 floating point ops. 1 scalar op = 2 floating point ops. If you do the math, you end up with 480 floating point ops per cycle.

This is not the same as shader ops. As you can see, there are multiple floating point ops to shader ops.

X360 can do 96 shader ops per cycle (48 vector + 48 scalar) - 48bn per second.
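
A minimal sketch of that derivation in Python, using only the weightings gofreak states above (8 flops per vector op, 2 per scalar op); the unit counts are the figures claimed in this thread, not confirmed silicon details:

```python
# Shader ops vs floating-point ops for Xenos, per the derivation above.
vector_units = 48          # vector ops issued per cycle (as claimed)
scalar_units = 48          # scalar ops issued per cycle (as claimed)
FLOPS_PER_VECTOR_OP = 8    # 1 vector op = 8 floating-point ops
FLOPS_PER_SCALAR_OP = 2    # 1 scalar op = 2 floating-point ops
CLOCK_HZ = 500e6

shader_ops_per_cycle = vector_units + scalar_units           # 96
flops_per_cycle = (vector_units * FLOPS_PER_VECTOR_OP +
                   scalar_units * FLOPS_PER_SCALAR_OP)       # 480

print(flops_per_cycle)                        # 480 floating-point ops per cycle
print(shader_ops_per_cycle * CLOCK_HZ / 1e9)  # 48.0 billion shader ops/sec
```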
 
In other words:


Geforce 6800 Ultra:
53 ops per clock * 400MHz = 21.2 billion ops per second

Geforce 6800 Ultra Extreme Edition
53 ops per clock * 450MHz = 23.9 billion ops per second

PS3 GPU (RSX)
136 ops per clock * 550MHz = 74.8 billion ops per second

Xbox 360 GPU (R500)
96 ops per clock * 500MHz = 48 billion ops per second
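
The whole table is just ops-per-clock times clock rate; a quick sanity check, taking the figures above at face value:

```python
# throughput (ops/sec) = ops per clock * clock rate, per the corrected table
for name, ops_per_clock, clock_hz in [
    ("GeForce 6800 Ultra",    53,  400e6),
    ("GeForce 6800 Ultra EE", 53,  450e6),
    ("PS3 GPU (RSX)",         136, 550e6),
    ("Xbox 360 GPU (R500)",   96,  500e6),
]:
    print(f"{name}: {ops_per_clock * clock_hz / 1e9:.2f} billion ops/sec")
# 21.20, 23.85 (~23.9), 74.80, 48.00
```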
 
Ghost of Bill Gates said:
actually,

It's faster than Microsoft has it listed. The eDRAM is like Super Memory that can handle certain GPU tasks. The GPU is codenamed XENOS. It also functions as the memory controller. 720p and 1080p with 4x AA is free. There is no performance hit. It has 48 agnostic pipelines and can do 98 billion ALU ops per second.

http://www.techreport.com/etc/2005q2/xbox360-gpu/index.x?pg=2

It's one thing to be able to copy and paste stuff you find on tech websites, but if you don't understand it then don't pass it off as if you do. I'm not anywhere near a tech guy, and I can tell that the stuff you just tossed out there makes no sense - you're just dumping stuff you read somewhere with big numbers into the discussion.

Before long guys like Pana will have enough info to do a detailed comparison and we'll have a meaningful discussion. But just copying stuff you read on other websites and dropping it into a discussion when it's clear you have no idea what you are talking about isn't doing anything but increasing the rate at which you will be mocked.

(RAWYBM*WFTBBQ = 2x the PS3!!!)
 
Nick Laslett said:
Edge went to press before the E3 conferences, but was published on 18th May.

Here is what the Xbox360 specs were in Edge before the willy waving started.

CPU Game Math Performance:
9 billion dot product ops / sec

Xbox 360's GPU can also do an additional 24 billion dot operations/sec.

According to Nvidia, RSX alone can do 51 billion dot operations/sec. o_O
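
For what it's worth, the 24 billion figure drops out cleanly if you assume one dot product per unified ALU per cycle; that mapping is an assumption, not something ATI has confirmed, and the RSX figure doesn't decompose as neatly:

```python
# ASSUMPTION (unconfirmed): one dot product per unified ALU per cycle on Xenos.
XENOS_ALUS = 48
XENOS_CLOCK_HZ = 500e6
print(XENOS_ALUS * XENOS_CLOCK_HZ / 1e9)  # 24.0 billion dot ops/sec, matching the claim

# Nvidia's 51bn claim for RSX implies ~93 dot ops per 550MHz clock, which
# doesn't map obviously onto the published 136 shader ops per cycle.
print(51e9 / 550e6)  # ~92.7
```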
 
CrimsonSkies said:
I doubt the PS3 GPU will be better than the Xbox GPU. We will see though. And if it is then Microsoft should just leave the market.

RSX is easily more powerful than Xbox 360 GPU. But that CERTAINLY doesn't mean MS should just leave the market.
 
Ghost of Bill Gates said:
actually,

It's faster than Microsoft has it listed. The eDRAM is like Super Memory that can handle certain GPU tasks. The GPU is codenamed XENOS. It also functions as the memory controller. 720p and 1080p with 4x AA is free. There is no performance hit. It has 48 agnostic pipelines and can do 98 billion ALU ops per second.

http://www.techreport.com/etc/2005q2/xbox360-gpu/index.x?pg=2



:lol :lol Stop it, you're killing me.

....on second thought, carry on.
 
gofreak said:
To elaborate further on what others are suggesting:

You're comparing an incorrect floating point performance figure with shader op figures.

The figure for floating point ops for Xenos is 480 per cycle (or should be).

That comes out of 48 vector ops per cycle + 48 scalar ops per cycle (we're now talking in terms of shader ops).

1 vector op = 8 floating point ops. 1 scalar op = 2 floating point ops. If you do the math, you end up with 480 floating point ops per cycle.

This is not the same as shader ops. As you can see, there are multiple floating point ops to shader ops.

X360 can do 96 shader ops per cycle (48 vector + 48 scalar) - 48bn per second.

Ok... now please bear with me because I am not a techie, but this is something that has been bothering me. I understand your math and you are correct. But, forgive me if I'm wrong: because it is running a Unified Shader Array and it can run 2 ops per cycle, why are we assuming that it's 48 vector + 48 scalar when we know them to be agnostic?

The Xbox 360 GPU breaks new ground in that the hardware shader units are intelligent as well. Very simply, the Xbox 360 hardware shader units can do either vertex or pixel shaders quickly and efficiently. Just think of the Xbox 360 shaders as being analogous to SIMD shader units (Single Instructions carried out on Multiple Data).

http://www.hardocp.com/article.html?art=NzcxLDM=

If that is the case, would it then be safe to assume that you could theoretically do 96 vector ops or 96 scalar ops or any combination in between (76 vector + 20 scalar = 96 total ops)? If the numbers are more impressive using vector ops, perhaps that is where the discrepancy lies.

So we have 48 shaders, each one of them does 4 floating-point ops per cycle, so 196 floating ops per clock.

So if it's dominated by vertices you get more resources for vertices, but if it's dominated by pixels you get more resources towards pixels, or any other kind of problem. It's a general purpose, well not a general purpose processor, but it is a processor with a good general instruction set and it can operate on a variety of different kinds of data. So unified shader means we have one set of shader hardware and it can operate on any problem.

http://www.firingsquad.com/features/xbox_360_interview/default.asp
 
3rdman said:
If that is the case, would it then be safe to assume that you could theoretically do 96 vector ops or 96 scalar ops or any combination in between (76 vector + 20 scalar = 96 total ops)? If the numbers are more impressive using vector ops, perhaps that is where the discrepancy lies.

Vector and Scalar ops don't map exclusively to vertex and pixel ops respectively. They're used in both. It's 48 of both simultaneously.

Extremetech have another (surprise!) confusing article up on Xenos:

http://www.extremetech.com/article2/0,1558,1818139,00.asp

Each of the ALUs can perform 5 floating-point shader operations. Thus, the peak computational power of the shader units is 240 floating-point shader ops per cycle, or 120 billion shader ops per second at 500MHz.

This would suggest it's only half as powerful as it should be. Shouldn't that be 10 floating point ops per ALU?
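
Worked out, the two readings differ by exactly a factor of two, which is the point of that question; a sketch, using the 48 ALUs and 500MHz clock quoted in this thread:

```python
# Extremetech's per-ALU figure vs the one implied by the earlier 480/cycle math.
ALUS = 48
CLOCK_HZ = 500e6

extremetech = ALUS * 5    # 240 floating-point shader ops per cycle (their claim)
implied     = ALUS * 10   # 480 per cycle (10 per ALU, as suggested above)

print(extremetech * CLOCK_HZ / 1e9)  # 120.0 billion/sec, Extremetech's figure
print(implied * CLOCK_HZ / 1e9)      # 240.0 billion/sec, double it
```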
 
Izzy said:
RSX is easily more powerful than Xbox 360 GPU. But that CERTAINLY doesn't mean MS should just leave the market.
That depends on how you define power. The 360 GPU has an additional processor and memory store that also performs ops like AA, stencil shadows etc. It's not just about flops.
 
dorio said:
That depends on how you define power. The 360 GPU has an additional processor and memory store that also performs ops like AA, stencil shadows etc. It's not just about flops.

That logic is nothing out of the ordinary. If it wasn't on the eDRAM it'd be on the GPU. They've just moved it to a place that makes more sense (and saves bandwidth in the process).

There's lots of stuff our floating point calculations above don't capture.
 
gofreak said:
There's lots of stuff our floating point calculations above don't capture.

Does that explain the 1 and 2 TFLOP performance numbers we saw Sony and MS put out there? :D
 
Nostromo said:
Shader ops are the EVIL, I told ya :lol

I think Extremetech are talking in terms of floating point ops though. And they seem to be saying 5 per ALU per cycle. Weird, no?

That other site seemed to say 4 per cycle per ALU.

Does that explain the 1 and 2 TFLOP performance numbers we saw Sony and MS put out there?

Basically, yeah.

edit - huh?

http://www.firingsquad.com/features/xbox_360_interview/default.asp

ATI: We have 48 shaders. And each shader, every cycle can do 4 floating-point operations, so that gives you 196. There’s a 192 number in there too, so I’m just going to digress a little bit. The 192 is actually in our intelligent memory, every cycle we have 192 processors in our embedded intelligent memory that do things like z, alpha, stencil. So there are two different numbers and they’re kind of close to each other, which leads to some confusion.

Err...so Xenos = 98Gflops? =/ (In terms of programmable shading power?)
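
Reading the interview literally, the "196" looks like a slip for 192 (48 shaders times 4 flops), with the other 192 being a separate count of eDRAM logic units; on that reading the programmable shading figure would indeed be 96 GFLOPS, though that interpretation is mine, not ATI's:

```python
# Untangling the two "19x" numbers in the ATI interview quoted above.
SHADERS = 48
FLOPS_PER_SHADER = 4
CLOCK_HZ = 500e6

shader_flops_per_cycle = SHADERS * FLOPS_PER_SHADER   # 192, not 196
print(shader_flops_per_cycle * CLOCK_HZ / 1e9)        # 96.0 GFLOPS programmable shading

EDRAM_PROCESSORS = 192  # separate z/alpha/stencil units in the embedded memory
```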
 
That reminds me of "Saturn is more powerful than Playstation because it has more processors!" or whatever it was back in the day.

Anyway, I don't think Major Nelson is exactly the bastion of objectivity. It's almost like posting a page from Robbie Bach's blog about Xbox 360 being the most powerful.

EDIT: Sorry, Phatsaqs, I didn't know you were joking.
 
PhatSaqs said:

Dear God. He surely can't get away with that? It's blatantly misleading. I never laughed so much at a bunch of graphs in my life :lol

So, we can compare X360's "effective" bandwidth to 10mb edram with PS3's total physical bandwidth to 512MB of system memory now? :lol

The real comparison is this:

X360: 22GB/s to 512MB of RAM, 256GB/s "effective" to 10MB.

PS3: 48GB/s to 512MB of RAM

When you realise X360 is spreading less than 10% of its bandwidth over 98% of its memory, things look a little different :p
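
Those percentages check out from the figures in the post; a quick sketch:

```python
# X360 bandwidth vs memory split, using the numbers quoted above.
main_bw_gbs, edram_bw_gbs = 22, 256   # GB/s: main RAM vs "effective" eDRAM
main_mem_mb, edram_mb = 512, 10       # MB

bw_share  = main_bw_gbs / (main_bw_gbs + edram_bw_gbs)   # ~0.079 -> under 10%
mem_share = main_mem_mb / (main_mem_mb + edram_mb)       # ~0.981 -> about 98%
print(f"{bw_share:.1%} of bandwidth serves {mem_share:.1%} of the memory")
```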
 
Vennt said:
I don't know, when he starts off with the flawed "3 cores vs 1 core" argument it kinda puts the rest of his PS3 analysis into question, don't you think?
Yeah. I don't know shit about tech but I knew that much just from looking at the GS comparison. Dude is funny :lol
 
PhatSaqs said:

Despite the somewhat biased claims he makes, I'm curious as to whether or not they contain any truth, especially the comments he makes about game performance, such as this:

Based on measurements of running next generation games, only ~10-30% of the instructions executed are floating point. The remainders of the instructions are load, store, integer, branch, etc. Even fewer of the instructions executed are streaming floating point—probably ~5-10%.

Is this true? I thought floating point operations were extraordinarily important? Or is he just referencing non-GPU work? Anyone with real game design experience care to address this?
 
gofreak said:
Dear God. He surely can't get away with that? It's blatantly misleading. I never laughed so much at a bunch of graphs in my life :lol

So, we can compare X360's "effective" bandwidth to 10mb edram with PS3's total physical bandwidth to 512MB of system memory now? :lol

Well, it's no less accurate than comparing "shader ops" in the long run, is it? ;) :D

Last gen:
Minor Scuffle blamed on Numbers, Technicians laugh & scorn & hand out the band-aids

Next gen:
Total all-out number Armageddon: many casualties caused, Basic Maths among those listed MIA, Red Cross said to be overwhelmed and the world's numeric population decimated, end of the world stylee...

:P
 
Nerevar said:
Is this true? I thought floating point operations were extraordinarily important? Or is he just referencing non-GPU work? Anyone with real game design experience care to address this?

I think he's obviously going to pick figures that support his case...

When you look at some of the stuff there, and you consider he's a MS employee, I wouldn't exactly rate his credibility.
 
Izzy said:
NVIDIA confirmed that the RSX features full FP32 support, like the current generation GeForce 6 as well as ATI's Xbox 360 GPU. NVIDIA did announce that the RSX would be able to execute 136 shader operations per cycle, a number that is greater than ATI's announced 96 shader ops per cycle. Link
That same website you quoted stated,

As we mentioned before, NVIDIA's RSX is the more PC-like of the two GPU solutions. Unlike ATI's offering, the RSX is based on a NVIDIA GPU, the upcoming G70 (the successor to the GeForce 6).
http://www.anandtech.com/tradeshows/showdoc.aspx?i=2423&p=3

RSX is nothing special.
 
Ghost of Bill Gates said:
That same website you qouted stated,



RSX is nothing special.

I think you have yourself mistaken for someone with credibility.
 
CrimsonSkies said:
I doubt the PS3 GPU will be better than the Xbox GPU. We will see though. And if it is then Microsoft should just leave the market.
You're not the only one who would be surprised...

As of right now ATI's X360 GPU is better... but we'll see if NVIDIA can come up with something.
 
Ghost of Bill Gates said:
Just because the facts don't agree with you doesn't mean I don't have credibility.

The only "facts" I have questioned is your credibility, my position is simply that companies are playing with some seriously wacky math & apples-to-potatoes comparisons for PR purposes that serve nobody other than gullible maroons such as yourself.

Or as I put it at Beyond3D...

"Lies, Damned Lies & Performance Metrics"

The fact that you fall for it all hook, line and sinker is infinitely amusing. :lol
 
There's not enough known about either GPU to make concrete arguments either way imho. It's one thing to speculate, but it's quite another to go around thinking you have all the facts in your back pocket. You don't.
 
Vennt said:
The only "facts" I have questioned is your credibility, my position is simply that companies are playing with some seriously wacky math & apples-to-potatoes comparisons for PR purposes that serve nobody other than gullible maroons such as yourself.

Or as I put it at Beyond3D...

"Lies, Damned Lies & Performance Metrics"

The fact that you fall for it all hook, line and sinker is infinitely amusing.

You don't know GoBG very well, do you? ...Well, I'm not going to hold that against you.
 