
RSX pics/Next gen Nvidia card benchmarks/info..

Marconelly said:
G70 and RSX numbers in that table don't include any texture ops either, so that evens the comparison, I guess.
That depends on whether texture ops are still coupled to the shader ones as they were previously. In the xgpu, they are not.
 
Marconelly said:
G70 and RSX numbers in that table don't include any texture ops either, so that evens the comparison, I guess.

Well, it might.

Someone on B3D has said that G70 decouples texture addressing from the shader, but the general thought up to now is that it didn't. Without decoupling, texture addresses would still take an ALU op from the shader.

The good news is that texture addresses aren't particularly frequent in long shaders, which should be the norm next gen, so a lot of the time the pixel shaders will have all that logic to themselves. The downside of decoupling is, AFAIK, that when the texture address units aren't needed, they can't double up as shader ALUs the way a coupled design's shared logic can. That said, I'd definitely take texture decoupling in RSX's design, since there seems to be no shortage of ALUs in its shaders.
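
To put rough numbers on the coupled-vs-decoupled difference, here's a toy back-of-the-envelope model (my own made-up instruction counts, not anything from NVidia or B3D): when addressing is coupled, each texture fetch steals an ALU slot from the shader; when it's decoupled, the shader keeps every slot.

[code]
# Toy model of shader ALU utilisation with coupled vs. decoupled texture
# addressing. Instruction counts are invented for illustration only.

def alu_utilisation(shader_ops, texture_fetches, coupled):
    """Fraction of issued shader slots that do useful arithmetic."""
    stolen = texture_fetches if coupled else 0   # fetches that cost an ALU slot
    return shader_ops / (shader_ops + stolen)

for name, ops, fetches in [("short shader", 8, 4), ("long shader", 80, 6)]:
    for coupled in (True, False):
        print(f"{name}, coupled={coupled}: "
              f"{alu_utilisation(ops, fetches, coupled):.0%} ALU utilisation")
[/code]

The long shader barely notices the coupling (80 useful ops out of 86 slots vs. 80 out of 80), which is the point above: with next-gen shader lengths the ALUs mostly get to keep themselves busy either way.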
 
Anyway, it's nice to begin to see where NVidia's RSX numbers at E3 were coming from.
Ok, pop quiz - G70 has a bunch of video decoding crap on board that is completely useless for PS3. What happens to those transistors:

a) Nvidia has monkeys working on this so they keep it on the chip doing nothing whatsoever.
b) the space is used to stick a GS inside or GS compatibility extensions :P
c) That Japanese blog is right and RSX has hw-tessellation (useless I know, but PSP has a bunch of useless features in its GPU too).
d) Some kind of sound assist (hw decoder for many ATRAC channels or something along those lines) - (I'd kinda like this if we don't have it anywhere else in the system).
e) Something completely different (lemon curry*?)

*I'm high on caffeine and sleep withdrawal, so forgive me.

Edited to add option for sound assist :P
 
I hope we'll get an answer tomorrow, or shortly, Faf. The chart suggests they'll be able to talk about RSX from tomorrow onwards, so hopefully any interviews with NVidia beyond this will sort it out, if they don't volunteer that info themselves.

I've a feeling the video logic might still be there, somehow. Don't know why, just a hunch.
 
CrimsonSkies said:
Wow, Microsoft got a raw deal. They get a GPU that's less powerful than ATI's next card's core. But magically nVidia delivered a more powerful GPU for the PS3 than their next card's core. :lol

Well duh, the ATI team with talent is working on the Rev GPU.
 
Fafalada said:
Ok, pop quiz - G70 has a bunch of video decoding crap on board that is completely useless for PS3. What happens to those transistors:

a) Nvidia has monkeys working on this so they keep it on the chip doing nothing whatsoever.
b) the space is used to stick a GS inside or GS compatibility extensions :P
c) That Japanese blog is right and RSX has hw-tessellation (useless I know, but PSP has a bunch of useless features in its GPU too).
d) Some kind of sound assist (hw decoder for many ATRAC channels or something along those lines) - (I'd kinda like this if we don't have it anywhere else in the system).
e) Something completely different (lemon curry*?)

*I'm high on caffeine and sleep withdrawal, so forgive me.

Edited to add option for sound assist :P
I'm confused. The video decoding won't be used for outputting RSX video whereas that function is handled separately on the 360?
 
[image: 2005-6-21-16-10-14-654986702.gif]



so if RSX's real floating point performance figures are 44 Gflops + 356 Gflops and we add in Cell's 218 Gflops, that gives PS3 about 618 Gflops total, in non-bullshit, peak theoretical performance?
:)
 
Chiggs said:
Tell that to Doube D; just check out his comments in this thread from last night:

http://new.ga-forum.com/showthread.php?t=52554

Oh, and by the way, Doube, if you're reading this - I was right.


LOL. What a joke. So let me get this straight. This article is BS (paraphrasing your own words) but they got the RSX specs down right?? All they did was upgrade the clock rate of the G70 and mark it as the RSX, which is the same crap b3d idiots have been doing forever now (that's called SPECULATION, FRIEND, NOT FACT). But, on the other hand, if we are to take what this article is saying for granted, then the RSX is bitch slapping the xenos regardless... so which is it? ;p
 
Doube D said:
LOL. What a joke. So let me get this straight. This article is BS (paraphrasing your own words) but they got the RSX specs down right?? All they did was upgrade the clock rate of the G70 and mark it as the RSX, which is the same crap b3d idiots have been doing forever now (that's called SPECULATION, FRIEND, NOT FACT). But, on the other hand, if we are to take what this article is saying for granted, then the RSX is bitch slapping the xenos regardless... so which is it? ;p

Here we go......

We've switched from discussion on aspects mode to full fledged troll mode.......
 
Tenacious-V said:
Here we go......

We've switched from discussion on aspects mode to full fledged troll mode.......

Here we go nothing... He posted a response directed @ me, I posted one back. get over it or move on
 
Doube D said:
Here we go nothing... He posted a response directed @ me, I posted one back. get over it or move on

Is anyone else in this thread as hostile as you??? Hellz no, so why not calm down? Keep this as a discussion instead of a troll war. You so eagerly want to veer this thread off course; if you want to fight over crap, keep it in PMs. Don't mess up this thread. The rest of us want to keep discussing and working through the issues with this.
 
dorio said:
Can someone explain where the 1.9 teraflop numbers come from?

Programmable power + non-programmable power.

We can work out the former; the latter is much harder to derive (if not impossible without a guided walkthrough from nVidia ;)).

midnightguy said:
so if RSX's real floating point performance figures are 44 Gflops + 356 Gflops and we add in Cell's 218 Gflops, that gives PS3 about 618 Gflops total, in non-bullshit, peak theoretical performance?

Peak programmable performance, yes.

Non-programmable doesn't mean non-existent, of course.
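
For anyone wanting to see the arithmetic behind those figures, here's my own hedged reconstruction. The per-pipe FLOP counts are the estimates that have been circulating for G70, not an official NVidia breakdown, so treat them as assumptions:

[code]
# Hedged reconstruction of the quoted programmable-FLOPS figures.
# Per-pipe FLOP counts are circulating G70 estimates, not official numbers.
clock_ghz = 0.550                       # rumoured RSX clock

vertex_gflops = 8 * 10 * clock_ghz      # 8 vertex units  x ~10 FLOPs/clock = 44.0
pixel_gflops  = 24 * 27 * clock_ghz     # 24 pixel pipes  x ~27 FLOPs/clock = 356.4
cell_gflops   = 218                     # Cell figure quoted earlier in the thread

print(vertex_gflops, pixel_gflops)                   # 44.0 356.4
print(vertex_gflops + pixel_gflops + cell_gflops)    # ~618 GFLOPS programmable peak
[/code]

That lines up with the 44 + 356 + 218 = ~618 figure above; everything fixed-function (fillrate, z, AA resolve and so on) sits on top of that and is counted differently.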
 
Tenacious-V said:
Here we go......

We've switched from discussion on aspects mode to full fledged troll mode.......

That's a little hypocritical, don't you think, considering you wrote this over at OA, 'JesusgotmeKFC':

http://*************/forums/index.php?showtopic=34758

-jinx- Gaf Mod/sony Whore Getting Owned Right No
LOL, Dunno why I'm posting this but I'm just laughing at how much he's getting called out on all his shit at GA. It's about fucking time. He's such a blatant Sony cock sucker, he'd make a good boyfriend to tsp. Bastards stole my thread......

http://ga-forum.com/showthread.php?t=52528&page=2&pp=50
 
Tenacious-V said:
I knew it!! I knew it was just a G70 modded to PS3 inputs. So long ago, nobody believed me.... They didn't state RSX stats because it would have compromised the G70 launch and fundamentally given ATi free spec sheets months early. But now that G70 is officially launched, RSX is as well.....

I love, though, how on those charts they use top of the line NV parts but compare them to an X800XL...... Scared much???? Put it up against the X850 XT PE for an even comparison.

Also, for those of you thinking it'll blow away the Xenos, take into account the efficiency of the unified pipes. Xenos will be more likely to hit its theoreticals than RSX will. And there's the 4xAA hit RSX will take as well, if they decide to employ AA at all. Basically, XBox 360 is coming out 6 months in advance and will hold its own pretty damn well, I can even say you'll probably see no difference in games.

You talk about starting troll wars when THIS was your first post in this thread??? lol, ok buddy
 
Doube D said:
You talk about starting troll wars when THIS was your first post in this thread??? lol, ok buddy

Explain to me how that was trolling?? I stated what was true. RSX is G70 but higher clocked. And the chart compared it with the X800XL, which is true as well. nVidia purposely avoided using ATi's top of the line card for a comparison. And my statement beneath it is the same. I never stated anything negative against PS3 in that post, just that XBox 360 will hold its own. The only thing I did was post that reply in an "I was right" attitude cause I was happy cause I knew it.

You're the one bringing in all the bitch slap talk.
 
Tenacious-V said:
Explain to me how that was trolling?? I stated what was true. RSX is G70 but higher clocked. And the chart compared it with the X800XL, which is true as well. nVidia purposely avoided using ATi's top of the line card for a comparison. And my statement beneath it is the same. I never stated anything negative against PS3 in that post, just that XBox 360 will hold its own.

You're the one bringing in all the bitch slap talk.

How was it trolling? For one, you state conjecture and claim it to be fact. You say the RSX is a higher clocked G70 (which is pure unsubstantiated wishful thinking). If you got white papers then post them, otherwise curb the BS. If you are using the posted chart as the basis for your claims, then don't come back claiming they posted X800 specs for comparison (cause what it says is R500/xenos). You don't like it? Too bad. You can't have your cake and eat it too. Either the chart is correct or false. To take the info you want out of it, claim it to be fact, and mince the rest to garner credit for your brand of tech is a well established trolling tactic.
 
Looks like the 7800 is a pixel Shading monster, but the Vertex shading is weak in comparison. The X850XT PE is actually better at vertex shading.

[image: 050618tom06.jpg (3DMark05 shader benchmark chart)]


Looks like if any work will be offloaded to CELL it will most likely be vertex shading. Think they had that planned? Techies here predicted CELL would do vertex load, and now 7800 is a PS monster with relatively weak VS.
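
For what it's worth, the kind of per-vertex work that could move to the CPU is conceptually simple; here's a rough sketch of a vertex transform (plain Python, nothing Cell- or SPE-specific, just to show the shape of the job):

[code]
# Rough illustration of per-vertex work a CPU could take off the GPU:
# transform each position by a 4x4 model-view-projection matrix.
# Nothing here is PS3/Cell-specific; it's only the shape of the workload.

def transform_vertices(vertices, mvp):
    """vertices: list of (x, y, z); mvp: 4x4 matrix as a list of rows."""
    out = []
    for x, y, z in vertices:
        v = (x, y, z, 1.0)
        out.append(tuple(sum(mvp[r][c] * v[c] for c in range(4))
                         for r in range(4)))
    return out

identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
print(transform_vertices([(1.0, 2.0, 3.0)], identity))   # [(1.0, 2.0, 3.0, 1.0)]
[/code]

It's embarrassingly parallel per vertex, which is exactly the sort of thing a wide SIMD CPU should chew through.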
 
gofreak said:
Programmable power + non-programmable power.

We can work out the former; the latter is much harder to derive (if not impossible without a guided walkthrough from nVidia ;)).



Peak programmable performance, yes.

Non-programmable doesn't mean non-existent, of course.
Thanks, so roughly 1.4 teraflops of the power is non-programmable. What type of things would that encompass? Things like z-ops, AA, etc.?
 
Tenacious-V said:
Looks like the 7800 is a pixel Shading monster, but the Vertex shading is weak in comparison. The X850XT PE is actually better at vertex shading.

[image: 050618tom06.jpg (3DMark05 shader benchmark chart)]

Two things..

a) 3DMark apparently has limitations re. vertex work. I'd have to look into it for specifics but that's the "word" being bandied about.

b) The G70 is probably CPU bound, whereas the X850 is not. It'd be interesting to stick them on faster CPUs and see where they end up. (This is probably the issue in many of the other benches where the G70 doesn't significantly outperform that X800 btw).

So I'm not sure how accurate a reflection that is of its actual vertex performance.

dorio said:
Thanks, so roughly 1.4 teraflops of the power is non-programmable. What type of things would that encompass? Things like z-ops, AA, etc.?

Everything that isn't running on the CPU or shaders. So texture addressing (in X360's case, anyway; in PS3's too if texture addressing is decoupled), texture ops, AA, z-ops..yeah. Basically everything else that is computational. But it isn't counted in the same way as programmable power, I'm sure.
 
Tenacious-V said:
Looks like the 7800 is a pixel Shading monster, but the Vertex shading is weak in comparison. The X850XT PE is actually better at vertex shading.

[image: 050618tom06.jpg (3DMark05 shader benchmark chart)]


Looks like if any work will be offloaded to CELL it will most likely be vertex shading. Think they had that planned? Techies here predicted CELL would do vertex load, and now 7800 is a PS monster with relatively weak VS.
I guess that's in line with the increase in pixel shaders (+8) vs. vertex shaders (+2)?
 
Doube D said:
How was it trolling? For one, you state conjecture and claim it to be fact. You say the RSX is a higher clocked G70 (which is pure unsubstantiated wishful thinking). If you got white papers then post them, otherwise curb the BS. If you are using the posted chart as the basis for your claims, then don't come back claiming they posted X800 specs for comparison (cause what it says is R500/xenos). You don't like it? Too bad. You can't have your cake and eat it too. Either the chart is correct or false. To take the info you want out of it, claim it to be fact, and mince the rest to garner credit for your brand of tech is a well established trolling tactic.

If you wanna keep biting me, PM me, I'm not veering this thread off course.

1) Months ago nVidia stated RSX was a modification of their upcoming next gen graphics card.
2) G70 unveiled and instantly RSX is right beside it. The answer being the specs are the same (save for clockspeed) and nVidia did not want ATi to have their next gen card's specs months in advance.
3) The posted chart is accurate; it was released by nVidia. I brought up my concern with the Xenos info earlier. If you look up a bit instead of lashing out at me, you'd see it was addressed by some of the members on the board.
4) If you think I have a well-established troll tactic, think again.
5) If you want to keep lashing out at me, PM me, I don't want this thread fudged over.
 
gofreak said:
Two things..

a) 3DMark apparently has limitations re. vertex work. I'd have to look into it for specifics but that's the "word" being bandied about.

b) The G70 is probably CPU bound, whereas the X850 is not. It'd be interesting to stick them on faster CPUs and see where they end up. (This is probably the issue in many of the other benches where the G70 doesn't significantly outperform that X800 btw).

So I'm not sure how accurate a reflection that is of its actual vertex performance.



Everything that isn't running on the CPU or shaders. So texture addressing, texture ops, AA, z-ops..yeah. Basically everything else that is computational. But it isn't counted in the same way as programmable power, I'm sure.

a) wouldn't the 7800's VS still max out what's available even with 3DM05 being limited?
b) related to (a) I guess, but shouldn't it still be higher? Even if it was limited by the program or CPU, the VS should be more advanced than the X850's; wouldn't it be at least a little higher, to reflect what advantages it does have?
 
Tenacious-V said:
Looks like the 7800 is a pixel Shading monster, but the Vertex shading is weak in comparison. The X850XT PE is actually better at vertex shading.

[image: 050618tom06.jpg (3DMark05 shader benchmark chart)]


Looks like if any work will be offloaded to CELL it will most likely be vertex shading. Think they had that planned? Techies here predicted CELL would do vertex load, and now 7800 is a PS monster with relatively weak VS.


Could be...

BTW, even though USAs (unified shader arrays) can run at near 100% efficiency, they are not as fast as dedicated pixel shaders....

http://www.atomicmpc.com.au/article.asp?SCID=14&CIID=22720&p=2

That being the case, we will probably need to benchmark X360 vs. PS3 to come to any real conclusions...
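
To illustrate the trade-off being argued here, a toy utilisation model (my own invented numbers, not real Xenos/RSX figures, and it deliberately ignores the per-unit speed difference the article above brings up):

[code]
# Toy model: dedicated vs. unified shader pools on workloads that shift between
# vertex-heavy and pixel-heavy frames. All numbers invented for illustration;
# this ignores per-unit speed differences between the two designs.

def dedicated_rate(vertex_units, pixel_units, vertex_work, pixel_work):
    # Each pool only runs its own job type; the frame waits on the slower pool.
    return 1.0 / max(vertex_work / vertex_units, pixel_work / pixel_units)

def unified_rate(total_units, vertex_work, pixel_work):
    # A unified array throws every unit at whatever work exists.
    return total_units / (vertex_work + pixel_work)

for vtx, pix in [(10, 90), (40, 60)]:   # share of a frame that is vertex vs. pixel work
    print(f"workload {vtx}/{pix}: "
          f"dedicated(8+24) = {dedicated_rate(8, 24, vtx, pix):.2f}, "
          f"unified(32) = {unified_rate(32, vtx, pix):.2f}")
[/code]

The unified pool holds steady while the dedicated split swings with the workload balance, which is why the efficiency argument probably only settles once real games get benchmarked.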

I must say I also fall into the group that doesn't necessarily see RSX's G70 relation as inherently bad...

Seems like a pretty kickass part to me :/

Fafalada said:
Both CPUs can cooperate with the GPU nicely (it's been heavily advertised on both sides as well).
That said, the way I see it, XeCPU isn't even in the same league as Cell when it comes to this kind of processing.

Yeah, the way CELL<->RSX integration was described seems really impressive to me :D
 
Tenacious-V said:
a) wouldn't the 7800's VS still max out what's available even with 3DM05 being limited?

No, not necessarily at all. It can only deal with what 3DM05 sends it.

Tenacious-V said:
b) related to (a) I guess, but shouldn't it still be higher? Even if it was limited by the program or CPU, the VS should be more advanced than the X850's; wouldn't it be at least a little higher, to reflect what advantages it does have?

Again, I don't know the ins and outs of how 3DM05 works, but if there is an issue with vertex work, it may just be that the X850 is a better fit for how it handles things vs G70. I don't think it's a reflection on its actual performance.

More generally, the CPU has a big impact with certain games. It can hold a card back in a major way, and I think that's probably the case with the G70. Not so with the X800 or X850, really. Beyond a certain point, it doesn't really matter how fast the GPU can do its work if the CPU is becoming the bottleneck..that's why if you plugged in faster CPUs I think you'd start seeing the difference (if not in 3Dmark, in some of the other benches certainly). It's the same reason why it's not worth buying cutting edge graphics cards if your CPU is somewhat old..the CPU will only hold it back, and you may be as well off with an older (cheaper) card.
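
The whole bottleneck point boils down to a one-line model (my own simplification, with invented timings):

[code]
# Simplified frame-time model: CPU and GPU work overlap, so the frame is gated
# by whichever side is slower. Timings are invented for illustration.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=20.0, gpu_ms=12.0))   # 50.0  -- CPU-bound
print(fps(cpu_ms=20.0, gpu_ms=6.0))    # 50.0  -- a GPU twice as fast changes nothing
print(fps(cpu_ms=10.0, gpu_ms=6.0))    # 100.0 -- the faster GPU finally shows up
[/code]

Which is why a faster CPU under the G70 would likely open up the gap in those benches.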
 
gofreak said:
No, not necessarily at all. It can only deal with what 3DM05 sends it.



Again, I don't know the ins and outs of how 3DM05 works, but if there is an issue with vertex work, it may just be that the X850 is a better fit for how it handles things vs G70. I don't think it's a reflection on its actual performance.

More generally, the CPU has a big impact with certain games. It can hold a card back in a major way, and I think that's probably the case with the G70. Not so with the X800 or X850, really. Beyond a certain point, it doesn't really matter how fast the GPU can do its work if the CPU is becoming the bottleneck..that's why if you plugged in faster CPUs I think you'd start seeing the difference (if not in 3Dmark, in some of the other benches certainly). It's the same reason why it's not worth buying cutting edge graphics cards if your CPU is somewhat old..the CPU will only hold it back, and you may be as well off with an older (cheaper) card.

Gotcha, thanks for clearing that up.
 
gofreak said:
More generally, the CPU has a big impact with certain games. It can hold a card back in a major way, and I think that's probably the case with the G70. Not so with the X800 or X850, really. Beyond a certain point, it doesn't really matter how fast the GPU can do its work if the CPU is becoming the bottleneck..that's why if you plugged in faster CPUs I think you'd start seeing the difference (if not in 3Dmark, in some of the other benches certainly). It's the same reason why it's not worth buying cutting edge graphics cards if your CPU is somewhat old..the CPU will only hold it back, and you may be as well off with an older (cheaper) card.

Yeah, I wonder if PS3/X360 will be GPU bound or CPU bound?

If they turn out to be GPU bound (and I would think that at least PS3 will be) then YEEEEE-HAW! Let's get ready for some .5B and up poly counts! :D
 
Kleegamefan said:
Yeah, I wonder if PS3/X360 will be GPU bound or CPU bound?

If they turn out to be GPU bound (and I would think that at least PS3 will be) then YEEEEE-HAW! Let's get ready for some .5B and up poly counts! :D
You're quite the optimist. :)
 
Doube D said:
LOL. What a joke. So let me get this straight. This article is BS (paraphrasing your own words) but they got the RSX specs down right?? All they did was upgrade the clock rate of the G70 and mark it as the RSX, which is the same crap b3d idiots have been doing forever now (that's called SPECULATION, FRIEND, NOT FACT). But, on the other hand, if we are to take what this article is saying for granted, then the RSX is bitch slapping the xenos regardless... so which is it? ;p


Your "paraphrasing" has greatly misconstrued what I wrote (surprise!). Sorry, but that's not even close to what I said about the article (which was basically a critique of the Xenos specs provided by Nvidia and the apples to oranges comparison- not the RSX specs, or even the article itself, really).

But let's get back to the topic at hand: You argued with me last night about how nobody outside of Nvidia and Sony knew what the RSX was, and I argued that pretty much anyone following it should know that it would be very similar to the G70 part from Nvidia. Of course, you shit all over this and said "GIVE ME NAMES, QUOTES, ETC!!!!". And that's fine if you don't want to deal with speculation, I understand completely - but I was still right and you were still, now that time has given us a better perspective on things, wrong.

And yes, the RSX certainly appears to be "bitch-slapping" the Xenos on paper (which I've never argued against - it's coming out after the 360, so it should be more powerful, right?); but the big question is: Will the much-vaunted ALUs in the Xenos achieve their theoretical maximum efficiency (which is what we were talking about last night)? The architectures of these two systems are very different, so the apples-to-oranges comparisons are a little stupid - it's not going to be that black and white. But as far as overall raw system power? PS3 > Xbox 360 - is this really even surprising? And stop trying to use it as an insult to me; it's not like you're hurting my Xbox 360 feelings. Like you said yourself last night, just because ATI made a different design decision this time around doesn't mean it's the "right" decision.
 
Fafalada said:
Ok, pop quiz - G70 has a bunch of video decoding crap on board that is completely useless for PS3. What happens to those transistors:

a) Nvidia has monkeys working on this so they keep it on the chip doing nothing whatsoever.
b) the space is used to stick a GS inside or GS compatibility extensions :P
c) That Japanese blog is right and RSX has hw-tessellation (useless I know, but PSP has a bunch of useless features in its GPU too).
d) Some kind of sound assist (hw decoder for many ATRAC channels or something along those lines) - (I'd kinda like this if we don't have it anywhere else in the system).
e) Something completely different (lemon curry*?)

*I'm high on caffeine and sleep withdrawal, so forgive me.

Edited to add option for sound assist :P

f) A huge array of 6502 cores.
 
Why are people saying that chart was made by nVidia?
Why would they release a chart in Chinese first? Makes no sense to me.
Sony's PS3's main engine, the RSX graphics core, is based on an enhanced G70 design; nVIDIA has officially produced a detailed G70 vs. RSX spec comparison for the first time, and has also provided reference data for the R500 graphics core used by the Xbox.
A rough translation from where the image was made:
http://www.hardspell.com/hard/showcont.asp?news_id=14372&pageid=3105
Sounds like they based RSX specs on the G70.

Anyway, it's good that the G70 isn't the slouch *many* were predicting, which isn't bad news for PS3 fans regardless.
 
Doube D said:
http://www.xtremesystems.org/forums/attachment.php?attachmentid=32827&stc=1

oh btw chiggs, I'm tired of this debate regarding RSX = G70. If we have an official release or statement from nvidia stating what you claim, fine, I was wrong. Until then, we will have to wait and see. The chart up there is CLAIMED by some Chinese site to be from nvidia. I'll wait for nvidia to come out and say it.


Fair enough. The NDA is up either tonight or tomorrow. We should have a lot more info shortly.
 
Doube D said:
http://www.xtremesystems.org/forums/attachment.php?attachmentid=32827&stc=1

This real?

oh btw chiggs, I'm tired of this debate regarding RSX = G70. If we have an official release or statement from nvidia stating what you claim, fine, I was wrong. Until then, we will have to wait and see. The chart up there is CLAIMED by some Chinese site to be from nvidia. I'll wait for nvidia to come out and say it.
Yep, but it's overclocked.
 
On the one hand, I'm pretty annoyed that RSX really may not be more than a pumped-up G70. I was really expecting better use of those transistors, and a concerted effort to try for something different, even slightly.

OTOH, G70/RSX is a powerful GPU regardless. And as I've said in the past, there's no reason RSX should be the weaker part, even though it doesn't try for unified shaders. 50-70% of 10-14GP is still greater than 90% of 4GP.
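
Spelling out that arithmetic (the gigapixel figures are the ones from the circulating chart and the efficiency percentages are just the guesses above, so this is illustrative, not measured):

[code]
# The efficiency comparison above, written out. GP figures are from the
# circulating chart; the efficiency percentages are guesses, not measurements.
rsx_low   = 0.50 * 10    # 5.0 GP/s effective
rsx_high  = 0.70 * 14    # 9.8 GP/s effective
xenos_eff = 0.90 * 4     # 3.6 GP/s effective

print(rsx_low, rsx_high, xenos_eff)   # even the low RSX case beats the Xenos figure used here
[/code]

Of course that all hinges on which GP figures you believe in the first place.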

I still would have liked to have seen more PS pipes or something. I really, REALLY hope the rumors of 128-bit blends are true. Even if it's not used that often, it would (a) be a change from the G70 and (b) offer some real possibilities to show off some of those advanced effects. So far so good. RSX > G70, and G70 is already laying the smackdown. NVidia hasn't disappointed...yet. PEACE.
 
Marconelly said:
R500 fillrate is 16 gigasamples = 4 gigapixels (GP in his post, I presume)


Yep, I misread the whole "GP" thing. Anyhow, it should be interesting to see how AA affects the performance of the RSX.
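
For anyone tripping over samples vs. pixels: with multisample AA each pixel is written as several samples, so the two figures are just related by the AA factor (assuming the 4x case being discussed):

[code]
# Samples vs. pixels under multisample AA, assuming the 4x case discussed here.
gigapixels_per_sec   = 4
aa_samples_per_pixel = 4
print(gigapixels_per_sec * aa_samples_per_pixel)   # 16 gigasamples/s, matching the quote above
[/code]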
 
Where's Duane getting 10~14GP for RSX vs 4GP of Xenos? If it's that chart, isn't it a poor source since it says 10GP for G70, 14GP for RSX, and 8GP for Xenos?

If the chart is right, then it's more of a 5:4 ~ 7:4 ratio than the 10:4 ~ 14:4 ratio that Duane is insinuating.

Regardless, that chart and article are speculation by a Chinese site, so we are getting worked up over something that is neither official nor necessarily correct.
 
Kleegamefan said:
Isn't Xenos 4 GigaPixels and 16 Anti Aliased GigaSamples??

At any rate....here is another 7800GTX 3DMark score for ya:

[image: attachment.php (7800GTX 3DMark score)]
Wow, what are the conditions of that score besides the overclocking?
 