RSX question...

Doube D

Member
OK, so I keep hearing that the RSX is basically a 7800 OCed by ~100 MHz. Really? Well then, could someone tell me wtf the engineers at NVIDIA are doing with the extra 8 months of dev time on their hands?? That is longer than a typical PC graphics card generation, yet all they are doing is OCing it?? There has to be something more to it, correct?

btw, when is the expected tape-out date?
 
They really haven't said much about RSX beyond the clock speed and that 7800-level tech is being used in it. How much RSX is going to be like the 7800 is just assumption at this point.

I'm in the same boat as you; it has got to be more than a simple overclocked 7800. For one thing, all the video/media capabilities in the 7800 would be taking up pointless space, since Cell can do all of that and then some, so that hardware is probably out.
 
RSX is Sony's chip, not NVIDIA's. For all we know, NVIDIA might not be doing anything else with the part right now. The dot-product figures, as well as the shader ops described at E3, suggest it at least has the same pipeline configuration as the G70. But there's really no reason for them to still be working on a chip that's been done since around March.
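For what it's worth, the shader-op figure lines up with a G70-style pipeline count. Here's a quick back-of-envelope sketch in Python; the pipe counts and per-pipe op rates are just the commonly quoted 7800 GTX breakdown, assumed here, not anything confirmed for RSX:

# Back-of-envelope: shader ops per clock for a G70-style config.
# Pipe counts and per-pipe rates are the commonly quoted 7800 GTX
# breakdown, assumed here; not confirmed RSX specs.
PIXEL_PIPES = 24
OPS_PER_PIXEL_PIPE = 5    # two vec4 ALUs plus extras, as usually counted
VERTEX_PIPES = 8
OPS_PER_VERTEX_PIPE = 2

ops_per_clock = (PIXEL_PIPES * OPS_PER_PIXEL_PIPE
                 + VERTEX_PIPES * OPS_PER_VERTEX_PIPE)
print(ops_per_clock)  # 136, matching the per-cycle figure quoted at E3

If that arithmetic holds, the E3 numbers don't point to anything beyond a G70 pipeline running at a higher clock.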

There's also the issue of the 30-40M extra transistors that aren't needed, since Cell will most likely handle all video-processing tasks. There's stuff they can add that won't alter the rendering pipeline at all: post-processing hardware, or (my personal favorite) large on-chip buffers to keep the external bus free. Extra hardware for emulation is another. There are a number of options Sony has at their disposal. I'd like to see a tile buffer, just b/c it would actually allow them a chance to do all that HDR+AA shit that some think is so vital. But we'll probably have to wait until November or CES in January for details on the chip. The GS wasn't announced until long after the EE. I don't think there's a reason to show off RSX at Hot Chips or whatever upcoming semiconductor show there is. PEACE.
 
Doom_Bringer said:
how do you know?

You know, based on how NVIDIA handled the Xbox's GPU, I wouldn't be surprised if it featured some newer tech that would come in later cards. Like, I think the GeForce 3 it was based on had only 1 vertex shader while the Xbox's GPU had 2, so I guess it was more like a GeForce 3.5.
 
Simply moving to 90nm isn't trivial; that alone would take some engineering time.

I believe it's due to tape out about now, if it hasn't done so already.

It seems very likely it's a similar config to G70 but at 550MHz, although apparently a few of the numbers disagree a little with that. I'd need to look that up to be sure, though. Rough math below on what the clock bump alone buys.
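As a sketch, assuming the same 136 ops/clock counted above and the 7800 GTX's 430MHz reference core clock:

# Rough throughput scaling with clock; 136 ops/clock is the assumed
# G70-style count from above, and 430MHz is the 7800 GTX reference clock.
OPS_PER_CLOCK = 136
G70_CLOCK_HZ = 430e6   # 7800 GTX reference core clock
RSX_CLOCK_HZ = 550e6   # clock quoted for RSX at E3

g70_ops = OPS_PER_CLOCK * G70_CLOCK_HZ   # ~58.5 billion shader ops/s
rsx_ops = OPS_PER_CLOCK * RSX_CLOCK_HZ   # ~74.8 billion shader ops/s
print(f"{g70_ops / 1e9:.1f}B vs {rsx_ops / 1e9:.1f}B shader ops/s, "
      f"+{(RSX_CLOCK_HZ / G70_CLOCK_HZ - 1) * 100:.0f}% from clock alone")

So any official per-second shader-op figure that doesn't land near 74.8 billion would be one of those numbers that disagrees with a straight G70-at-550MHz reading.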
 