
RSX pics/Next gen Nvidia card benchmarks/info..

realize, but I'm asking where his 10~14GP pixel fillrate figure for RSX is coming from.
There was this leaked chart a month or so ago that had that number on it for G70. It was 10 or 11GP, I can't remember exactly (RSX number is probably just extrapolated somehow). Other stuff that was on that chart has so far been pretty much confirmed as true.
 
gofreak said:
Programmable power + non-programmable power.

We can work out the former, the latter is much harder to derive (if not impossible without a guided walkthrough from nVidia ;)).



Peak programmable performance, yes.

Non-programmable doesn't mean non-existent, of course.

okay, understood now.
 
Found this series of posts @ B3D interesting...


xbdestroya said:
What they're saying is that the Cell can work on stuff like physics, collisions, geometry, etc... and export the data to RSX. They're also saying that, due to the bandwidth offered between Cell and RSX, this sort of operation is encouraged rather than discouraged.

Rockster said:
What would RSX do with physics or collision data?

Alpha_Spartan said:
Let's imagine a boat on water. It looks crappy on current consoles because the water physics don't accurately respond to the object on it. With RSX having access to physics data, all kinds of cool things can be done. Imagine the PS3 rendering waves tossing a ship around. The GPU is able to render this scene accurately in real-time according to the physical interaction between the ship and the waves. That's just one example I can think of.

Amazing, if true...
 
Shogmaster said:
Where's Duane getting 10~14GP for RSX vs 4GP of Xenos? If it's that chart, isn't it a poor source since it says 10GP for G70, 14GP for RSX, and 8GP for Xenos?

If the chart is right, then it's more of a 5:4 ~ 7:4 ratio than the 10:4 ~ 14:4 ratio that Duane is insinuating.

Regardless, that chart and article are speculation from a Chinese site, so we are getting worked up over something that's neither official nor necessarily correct.
I thought the pixel draw rate was changed. My bad. Texel fillrate is as you state above. Pixel draw rate is 8.8GP for the "RSX" and 4GP for Xenos. In any case, 50% of 8.8 is still bigger than 90% of 4. And this is, of course, ignoring the quality of the pixels being drawn. Elegance/efficiency would have to be considerably in Xenos' favor for it to vault in front of RSX. That's up in the air. But by the sample util figures the ATI guy gave, RSX still comes out on top.

This is assuming this chart is even real. I'm assuming at least some of it is, b/c the rest of the info seems to match with what we've gotten from NVidia so far. PEACE.
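Just to put numbers on that comparison (purely back-of-the-envelope; the 8.8GP/4GP peaks and the 50%/90% utilization figures are all speculation from that chart and the B3D thread, not anything official):

```python
# Back-of-the-envelope effective fillrate comparison using the speculated numbers above.
# Every input here is a rumor or a guess, not a confirmed spec.

rsx_peak_gpix = 8.8     # speculated RSX pixel draw rate (GPixels/s)
xenos_peak_gpix = 4.0   # speculated Xenos pixel draw rate (GPixels/s)

rsx_util = 0.50         # hypothetical RSX utilization/efficiency
xenos_util = 0.90       # hypothetical Xenos utilization/efficiency

rsx_effective = rsx_peak_gpix * rsx_util        # 4.4 GPixels/s
xenos_effective = xenos_peak_gpix * xenos_util  # 3.6 GPixels/s

print(f"RSX effective fill:   {rsx_effective:.1f} GPix/s")
print(f"Xenos effective fill: {xenos_effective:.1f} GPix/s")
```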
 
Kleegamefan said:
Found this series of posts @ B3D interesting...


Amazing, if true...

I find this more amazing, personally:

DemoCoder said:
G70/RSX supposedly supports sparse grid supersampling (according to Xmas), which is a very nice IQ boost to fix aliasing caused by pixel shaders which today is not helped by MSAA at all, and won't be fixed by alphatomask either.

If sparse grid supersampling is really supported, you could render-to-texture a supersampled HDR buffer, and use that to get HDR FSAA but at superior quality (of course, it ain't free)

Of course, the performance hit is probably severe. I'm also interested in seeing if G70\RSX can do fp16 HDR and AA, which the R520 supposedly can.
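For anyone wondering what "render-to-texture a supersampled HDR buffer" actually buys you, here's a rough sketch of the resolve step (not how G70/RSX does it internally; just the general idea, with a plain box filter in NumPy):

```python
import numpy as np

def resolve_supersampled_hdr(hdr_buffer, factor=2):
    """Resolve a supersampled floating-point HDR buffer down to display resolution.

    hdr_buffer: (H*factor, W*factor, 3) float array of linear HDR values.
    Returns an (H, W, 3) buffer averaged with a simple box filter; real hardware
    resolves with its own sample pattern and filter, this is only the concept.
    """
    h, w, c = hdr_buffer.shape
    assert h % factor == 0 and w % factor == 0
    return hdr_buffer.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

# Toy example: a 2x2 supersampled 720p HDR buffer resolved back to 1280x720.
ss = np.random.rand(720 * 2, 1280 * 2, 3).astype(np.float32) * 4.0  # values > 1.0 = HDR
resolved = resolve_supersampled_hdr(ss, factor=2)
print(resolved.shape)  # (720, 1280, 3)
```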
 
Yeah, I'm not holding my breath for widespread use of Anti-aliased Alpha textures on RSX either....

Fillrate hit must be HUGE, I would think....unless they have some kickass AA logic on RSX that doesn't use eDRAM??

Yeah, probably not :)
 
And that part is "only" clocked at 430MHz...

RSX will have a ~120MHz boost plus god knows what other stuff.....they sure don't need to support a lot of the legacy stuff the 7800-series does in hardware, that's for sure...

Also, as Faf pointed out, GTX is doing some WMV9 or H.264 acceleration in hardware.....no need for that on RSX with 7 SPEs sitting there in CELL, waiting to rip through advanced audio/video codec duties (and better suited to it, I would think)...

Without question, RSX will be G70-ish, but expect some changes other than just a speed boost...
 
StoOgE said:
damn, PS3 is going to cost a fortune to make.

CELL $100
GPU $150
BD-ROM/DVD/CDROM DRIVE $100
256MB XDR $100
256MB DDR3 $100
Motherboard $30
Bluetooth controller $30
Wi-Fi $20
Gigabit Ethernet $20
Various USB/HDMI/AV Out/Memory Card IO Slots $20
Case $10
Power Supply $10
Cables/screws/manuals/packaging $10
------------------------------------------------------------------
Total Cost $700


Luckily the Xbox360 will be peanuts. The world will be balanced and everything in its place. I say bring on the $500 PS3!
 
sonycowboy said:
CELL $100
GPU $150
BD-ROM/DVD/CDROM DRIVE $100
256MB XDR $100
256MB DDR3 $100
Motherboard $30
Bluetooth controller $30
Wi-Fi $20
Gigabit Ethernet $20
Various USB/HDMI/AV Out/Memory Card IO Slots $20
Case $10
Power Supply $10
Cables/screws/manuals/packaging $10
------------------------------------------------------------------
Total Cost $700
Please tell me it's sarcasm :)
 
I wasn't implying the 360 would be cheap.. though I do imagine the PS3 may wind up costing more.. the R&D on Cell alone should ensure that.
 
Kleegamefan said:
And that part is "only" clocked at 430MHz...

RSX will have a ~120MHz boost plus god knows what other stuff.....they sure don't need to support a lot of the legacy stuff the 7800-series does in hardware, that's for sure...

Also, as Faf pointed out, GTX is doing some WMV9 or H.264 acceleration in hardware.....no need for that on RSX with 7 SPEs sitting there in CELL, waiting to rip through advanced audio/video codec duties (and better suited to it, I would think)...

Without question, RSX will be G70-ish, but expect some changes other than just a speed boost...


I still don't think RSX will hit 550MHz. Like I said in another thread, Nvidia is having a tough time getting a 450MHz part out. It'll get downclocked before final form.
 
Perhaps, but RSX will be manufactured at a state-of-the-art 90nm plant set up for said chip......the same cannot be said about the 110nm G70.....

IMO, RSX will be 500+MHz, if not 550.....
 
Mrbob said:
I still don't think RSX will hit 550MHz. Like I said in another thread, Nvidia is having a tough time getting a 450MHz part out. It'll get downclocked before final form.
I've seen the G70 overclocked to 530MHz.
Without question, RSX will be G70-ish, but expect some changes other than just a speed boost...
Well of course you can expect changes to the interface for FlexIO support etc., but if you're expecting a lot more than that then I wouldn't get my hopes up, if only to not be disappointed later.
 
Kleegamefan said:
Let's imagine a boat on water. It looks crappy on current consoles because the water physics don't accurately respond to the object on it. With RSX having access to physics data, all kinds of cool things can be done. Imagine the PS3 rendering waves tossing a ship around. The GPU is able to render this scene accurately in real-time according to the physical interaction between the ship and the waves. That's just one example I can think of.

Fire_Bird said:
I'm sure some of us remember Blood Wake (or not; it was a boat warfare game). It was an Xbox launch title, if not shortly after. The wave aesthetics were great, and I say aesthetics because while the GPU provided the power to produce reasonable-looking waves, it did not interact with the world very well; the boats just never seemed to be sitting on the water right.

A reasonable explanation could be that after the GPU transformed the vertex information, it was only transformed in the GPU memory, not in the memory used by the collision detection code. Remember, the game has to know where the water (wave crests and stuff) starts, because the boats have to float on the water right.

With MemExport, the transformed vertex information that is vital for proper collision detection could be updated by the GPU, making life less of a guessing game. This goes for animation and any vertex information (deformation of an entire world that has been nuked).

http://forum.teamxbox.com/showpost.php?p=5465863&postcount=242

:D
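To make the quoted idea a bit more concrete, here's a toy sketch of the data flow being described. The function names and the "shared buffer" are made up for illustration; this isn't the real MemExport API or anything RSX-specific:

```python
import numpy as np

# Illustrative only: the usual problem is that collision code tests against the
# untransformed CPU-side mesh, while the GPU deforms a copy it never shares back.
# A MemExport-style path would let the GPU write its post-transform vertices
# somewhere the physics code can read them. Everything below is a toy stand-in.

def gpu_transform(rest_vertices, model_matrix):
    """Stand-in for the GPU vertex stage: applies a 4x4 transform to Nx3 vertices."""
    homogeneous = np.hstack([rest_vertices, np.ones((len(rest_vertices), 1))])
    return (homogeneous @ model_matrix.T)[:, :3]

# A tiny "boat" hull, and a transform that drops it toward the water plane (y = 0).
rest_hull = np.array([[-1.0, 0.5, 0.0], [1.0, 0.5, 0.0], [0.0, -0.5, 0.0]])
model_matrix = np.array([[1.0, 0.0, 0.0,  0.0],
                         [0.0, 1.0, 0.0, -0.4],   # sink the hull by 0.4
                         [0.0, 0.0, 1.0,  0.0],
                         [0.0, 0.0, 0.0,  1.0]])

# The "write-back" step: transformed vertices land where the physics code can see them.
shared_collision_buffer = gpu_transform(rest_hull, model_matrix)

# Collision/buoyancy now works on what is actually rendered, not the rest pose.
submerged = shared_collision_buffer[:, 1] < 0.0
print(f"{submerged.sum()} of {len(rest_hull)} hull vertices are below the waterline")
```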
 
If you get a card with good silicon I wouldn't be surprised if some G70s overclock over 550MHz. But that doesn't mean all of them will or even most of them. 500MHz should be doable, but I think 550 is pushing it. Especially when Sony is going to want to have good yields to build X amount of consoles. We aren't talking about thousands anymore for a production run. We are talking millions.
 
I wasn't implying the 360 would be cheap.. though I do imagine the PS3 may wind up costing more.. the R&D on Cell alone should ensure that.

It's hard to tell. I would think the PS3 will certainly cost more at first, but it could end up the cheaper one in the long run (or maybe on par).

Sony has a couple advantages over MS. For Cell, the R&D costs are spread out not only between Sony, IBM, and Toshiba - but in several product lines. It's a scalable architecture intended to be used in practically anything that needs decent processing. PS3 may be one of the first items to use it, but it ain't going to be the last. Beyond the R&D costs, Sony should be getting Cell at cost, I would think.

Similar future cost reductions exist for Blu-ray. The R&D will be spread between numerous product lines, Sony is on the receiving end of royalties, and they can produce it in-house at cost.
 
"Sony has a couple advantages over MS. For Cell, the R&D costs are spread out not only between Sony, IBM, and Toshiba - but in several product lines. Its a scalable architecture intended to be used in practically anything that needs decent processing. PS3 may be one of the first items to use it, but it ain't going to be the last. Beyond the R&D costs, Sony should be getting Cell at cost I would think. "

I dunno, I find this argument a bit off. It may well be used in other products, but it's still going to be in 100 million PS3s - surely that would mean that the majority of Cell's cost HAS to be attributed to PS3 dev? I mean... do people think there'll be anywhere near as many units of other hardware that use Cell?
 
Mrbob said:
If you get a card with good silicon I wouldn't be surprised if some G70s overclock over 550MHz. But that doesn't mean all of them will or even most of them. 500MHz should be doable, but I think 550 is pushing it. Especially when Sony is going to want to have good yields to build X amount of consoles. We aren't talking about thousands anymore for a production run. We are talking millions.

Well, apparently 520MHz has been reached with stock cooling, which is pretty impressive. If that's a general trend it seems quite clockable as is, but yeah, the guys who did that may have just got "good" samples to work with. Remember, though, RSX will also be at 90nm vs 110nm.

Kleegamefan said:
Yeah, I'm not holding my breath for widespread use of Anti-aliased Alpha textures on RSX either....

The multisampling version, TMAA, should be virtually free apparently. But we shall see..

Where are the English reviews? By my reckoning they should have hit..

sangreal said:

The example given in the B3D thread wasn't great, but I think it was meant to illustrate the concept rather than serve as a full demonstration of everything PS3 can do. Obviously the CPU calculating data and passing it to the GPU is nothing new, but it's the scale and depth of that communication that is different here.

edit - err, sorry, I see he was talking about memexport there. Well, I'm not sure how well collision detection would map to a GPU for starters..and either way, if you can do this on the CPU it'd be better since you'll be eating into rendering power (quite significantly) if you put stuff like this on the GPU, I think. If you have the power to spare and can map your algo to a shader, grand, but I'm not sure if many devs will want to sacrifice graphics for that. It's nice to have the choice, though.
 
dorio said:
I'm still not seeing why any of that can't be done at the CPU level before the vertices are sent to the GPU. Can someone give me a more detailed explanation of why this would require something special from the hardware architecture?

Probably because current CPUs use a much-simplified physics simulation, rather than processing per vertex as is easy to do on a GPU.

This can be seen most clearly in the idea of the bounding box, which was part and parcel of many FPS games until recently; hit detection was calculated against a six-sided box scaled to the approximate size of the character, allowing shots passing through the character's empty spaces (like between the arm and torso) to be counted as hits. Later games used more bounding boxes, and more detailed ones representing the arm, head, leg, etc. areas.

It seems that with Cell and other modern processors, the ability to handle large amounts of vertex data and apply physics calculations to it is finally viable, and will probably prove to be just as big a graphical upgrade as the massive pixel shading capabilities available to the new GPUs. Imagine an accurate collision-based IK system that may simulate not only the bones but the muscles and skin of the character!
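A toy illustration of the bounding-box point above (nothing console-specific; it just shows why a box test counts shots through empty space as hits, while a finer-grained test against the actual geometry would not):

```python
import numpy as np

def hits_bounding_box(point, box_min, box_max):
    """Classic cheap test: is the shot inside the character's axis-aligned box?"""
    return bool(np.all(point >= box_min) and np.all(point <= box_max))

def hits_mesh(point, vertices, radius=0.1):
    """Crude stand-in for a fine-grained test: is the shot near any actual vertex?
    (A real engine would test triangles, but the contrast is the same.)"""
    return bool(np.any(np.linalg.norm(vertices - point, axis=1) < radius))

# A stick-figure "character": a few vertices for the head, torso and arms.
character = np.array([
    [0.0, 1.8, 0.0],   # head
    [0.0, 1.2, 0.0],   # chest
    [0.0, 0.6, 0.0],   # hips
    [-0.6, 1.2, 0.0],  # left hand
    [0.6, 1.2, 0.0],   # right hand
])
box_min, box_max = character.min(axis=0), character.max(axis=0)

# A shot passing between the arm and the torso: inside the box, far from any geometry.
shot = np.array([0.3, 1.5, 0.0])
print("bounding box says hit:  ", hits_bounding_box(shot, box_min, box_max))  # True
print("per-vertex test says hit:", hits_mesh(shot, character))               # False
```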
 
DCharlie said:
"Sony has a couple advantages over MS. For Cell, the R&D costs are spread out not only between Sony, IBM, and Toshiba - but in several product lines. Its a scalable architecture intended to be used in practically anything that needs decent processing. PS3 may be one of the first items to use it, but it ain't going to be the last. Beyond the R&D costs, Sony should be getting Cell at cost I would think. "

I dunno, I find this argument a bit off. It may well be used in other products, but it's still going to be in 100 million PS3s - surely that would mean that the majority of Cell's cost HAS to be attributed to PS3 dev? I mean... do people think there'll be anywhere near as many units of other hardware that use Cell?

Depends on the scope of applications that they've found for the CELL.

Theoretically it has a much wider market base than just the PS3, being able to scale up for anything that requires massive serial number crunching capabilities, which seems to be most supercomputer applications... which can then in turn be applied to a wider variety of situations...

So eventually you can have things like soldiers generating realtime 3D maps of accurate topographical data on their PDA units.

On the other hand, actually moving the CELL into place and keeping it updated and usable in the future is the real challenge; it seems like this is an architecture to stick with, allowing backwards and forwards scalability between different generations of the chip.
 
gofreak said:
Well, apparently 520MHz has been reached with stock cooling, which is pretty impressive. If that's a general trend it seems quite clockable as is, but yeah, the guys who did that may have just got "good" samples to work with. Remember, though, RSX will also be at 90nm vs 110nm.

Don't be so optimistic about that. These are brand new cards; read: they haven't been "out for long". These are clock rates that the GPU sustained with a heavy load over a short period of time with no visible artifacts. Anyone with notable overclocking experience knows that what they boasted as running perfectly with certain overclocks can just die the next week, the following week, etc... It is the risk they take for their one card with a single GPU they own ($599 + s/h or tax). It is definitely not a risk Sony/NVIDIA takes with 1 million GPUs inside 1 million PS3s on launch day with cost, cooling, power, and other limitations.

And whoever said "520MHz has been reached with stock cooling" is basically just spreading empty hype. Any respectable person would say "520MHz was sustained without any artifacts at certain temperatures for a given amount of time without burning out so far."



We'll know more once the cards are released and have been out for a good amount of time for people to play with.
 
599.99

:lol Stopped reading after that. Seriously, no wonder PC games are in such a shitty state. This plus the fact that I have to worry about BF2 running like crap on my system ruined my morning. This card should be $350 at most.. Hopefully this will mean the 6800 GT and Ultra are coming down in price.
 
Yeah, I'm not holding my breath for widespread use of Anti-aliased Alpha textures on RSX either....
That kind of AA is supposedly free on G70, so I'd assume it would be used often, as it's really damn useful (it fixes one of the most annoying problems with today's graphics, IMO)

The sparse grid supersampling, however, will obviously have a fillrate hit of some kind (I have no idea how much, as I have no clue how it operates).
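As a very rough rule of thumb (my own guess, not anything from nVidia), supersampling shades and fills every sample, so the cost grows roughly with the sample count:

```python
# Rough rule of thumb: supersampling shades and fills every sample, unlike multisampling,
# so per-frame pixel work scales roughly with the sample count (ignoring bandwidth/caches).
base_pixels = 1280 * 720
for samples in (2, 4, 8):
    print(f"{samples}x SSAA: ~{samples}x the shading/fill work "
          f"({base_pixels * samples:,} shaded samples per frame)")
```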
 
Marconelly said:
That kind of AA is supposedly free on G70, so I'd assume it would be used often, as it's really damn useful (it fixes one of the most annoying problems with today's graphics, IMO)

The sparse grid supersampling, however, will obviously have a fillrate hit of some kind (I have no idea how much, as I have no clue how it operates).

Whoa :o I've always wondered why they can't do decent alpha channels on those textures, but it's not really needed with AA alpha channel textures...

Yay, more visual inconsistencies, which have always annoyed me have been fixed!
 
Elios83 said:
nVidia has updated its site with info on the GF7800:

http://www.nvidia.com/page/geforce_7800.html

There are small movies that illustrate the new features.

Wow, that's really something. I hope we will see effects like that in PS3 games (subsurface scattering).

I saw that you can download the Luna video in 720p!

http://www.nvidia.com/page/geforce7_demos.html
http://www.nzone.com/object/nzone_luna_videos.html

 
Doc Holliday said:
599.99

:lol Stopped reading after that. Seriously, no wonder PC games are in such a shitty state. This plus the fact that I have to worry about BF2 running like crap on my system ruined my morning. This card should be $350 at most.. Hopefully this will mean the 6800 GT and Ultra are coming down in price.

Why would this cause PC games to be in a shitty state? Now you have the ability to run all the latest games at 4x the resolution of consoles with perfect 60fps framerate.
 
Wow, that's really something. I hope we will see effects like that in PS3 games (subsurface scattering).


During E3, nVidia specifically made a big deal about subsurface scattering effects being used on RSX....the Spider-Man 2 Alfred Molina demo also used SSS effects.....
 
seismologist said:
Why would this cause PC games to be in a shitty state? Now you have the ability to run all the latest games at 4x the resolution of consoles with perfect 60fps framerate.


because you can't play all the latest games
 
Now it's time for Konami to use all these effects to create a new Silent Hill with an absolutely perverse 'other world'.
 
sp0rsk said:
because you can't play all the latest games

does not compute

---------------------

And to the guy laughing above - some people have the cash and will pay even the most premium price for this card.

I know I would if I could.

BF2 in 1600x1200, 8xAA, 8xAF and 60 fps?

Online gaming doesn't get any better than that.
 
Amazing stuff. I want this. Don't really need it so much for current games, however. :lol I need to think about what I am going to do with my PC.
 
Sounds like it gives ray tracing-like effects??

http://www.tomshardware.com/column/200506221/index.html

The other nifty stuff that the 7800 brings includes the ability to do clever texture mapping tricks like lighting and pseudo-normal calculation with just two 2D maps, known as relief mapping. This gives you really bumpy surfaces for walls and cobblestoned streets, with shadows that follow lights like they should - but at a much lower computational load than if you did real ray tracing (which we will be doing one day not too long from now.)
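For the curious, here's a very crude sketch of what relief mapping is doing under the hood, written as plain Python rather than shader code (the real thing runs per-pixel in the fragment shader, and the conventions here are simplified assumptions): march along the view ray through a height map until it dips below the surface, then shade using the normal stored at that point.

```python
import numpy as np

def relief_march(height_map, start_uv, view_dir_ts, steps=32, depth_scale=0.5):
    """Linear ray-march through a height field in texture space (toy relief mapping).

    height_map:  2D array in [0, 1], where 1.0 is the surface top and 0.0 the deepest point.
    start_uv:    (u, v) where the view ray crosses the surface plane.
    view_dir_ts: tangent-space direction from the surface point toward the eye (z > 0).
    Returns the (u, v) where the ray first dips below the height field, or None.
    """
    view_dir_ts = np.asarray(view_dir_ts, dtype=float)
    uv_step = -view_dir_ts[:2] / view_dir_ts[2] * depth_scale / steps
    uv = np.array(start_uv, dtype=float)
    h, w = height_map.shape
    for i in range(steps):
        ray_depth = i / steps                              # 0 at the top, approaching 1 at max depth
        texel = height_map[int(uv[1] * (h - 1)) % h, int(uv[0] * (w - 1)) % w]
        if ray_depth >= 1.0 - texel:                       # ray has gone below the surface
            return tuple(uv)                               # shade with the normal map sampled here
        uv += uv_step
    return None                                            # ray exits without hitting anything

# Toy height map: a raised square bump in the middle of the tile.
hm = np.zeros((64, 64))
hm[24:40, 24:40] = 1.0
hit = relief_march(hm, start_uv=(0.7, 0.5), view_dir_ts=(0.7, 0.0, 0.7))
print("ray hits the height field at uv ~", hit)
```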
 
Elios83 said:
On the nVidia site they talk about Radiosity as one of the features.

It's not a "general" solution they're using there; from the brief description it sounds like they're following light from the walls out into the scene, but no further transfers. That'd be done within the shaders, of course; I don't think they've done anything to specifically accelerate that, etc.

I wonder now if NVidia can talk about RSX or not. I've only read the B3D review, which had no mention of it, but judging by that Chinese site's review, they were able to reveal some info about it (assuming that chart came from NVidia). It'd be nice to have some RSX specifics..
 
Those demos look so damn impressive in 720p. I wish they'd have the Nalu demo in that res too, and especially the Alfred Molina demo.
 
Dr_Cogent said:
:lol

$600 for a graphics card. No way in hell would you catch me spending that.


Looking at some of the benchmarks, it is certainly not worth it for certain games. Don't get me wrong, had this been during the $499.99 X800 XT PE and $499.99 6800 Ultra era then it would fall into place, I guess... Then again, the cheapest I can find these cards new is $420, $430, and $440 for the X800 XT PE, 6800 Ultra, and X850 XT PE respectively. Even then those are too expensive when you can get a refurb AIW X800 XT for $306. Best deal on a video card right now, end of story, considering the card hits XT PE speeds easily.


http://graphics.tomshardware.com/graphic/20050622/images/image041.gif
http://graphics.tomshardware.com/graphic/20050622/images/image043.gif
http://graphics.tomshardware.com/graphic/20050622/images/image045.gif
http://graphics.tomshardware.com/graphic/20050622/images/image039.gif
 