
Beyond3D Xenos Article

Well, in all honesty, I think some people are setting themselves up for a disappointment. I mean, yeah, next-gen consoles might be able to render stuff like that demo Square showed in real time, but it's one thing to render a scene, and another to render an entire game world at that level of detail + do physics and AI.

That thread, what was it called, "post a pic that represents next gen to you", was just filled with stuff that was over the top imo.
 
rastex said:
I think what this also implies is that the 6-month time gap between X360 and PS3 can't necessarily be equated to a power advantage for PS3. From the examples you've pointed out, the trend doesn't always follow and thus it's not a very strong argument for PS3's power advantage (and time is really the main argument I've seen bandied about)

The X850 is overall a more powerful card, the point is it was coming 6 months later and was not a heck of a lot more powerful. The argument would be that it wasn't SO much more powerful, which would make it questionable how they could release a more powerful card six months earlier (which is where things tie back to what's happening with the consoles).

Yes, it could swing the other way too, as happened with the NV3x (although one could easily argue that the NV3x was anomalous, while the R4xx was not). I don't necessarily disagree with your point.

I would say time is one potential factor that could work to RSX's advantage, anomalous messups aside.

Hajaz said:
I mean, yeah, next-gen consoles might be able to render stuff like that demo Square showed in real time, but it's one thing to render a scene, and another to render an entire game world at that level of detail + do physics and AI

On the physics and AI point, they should not infringe on the potential for a game to output those kinds of visuals, given that most of the demos were only using a fraction of the CPU power. I.e. you could have graphics like that and heavy physics and AI, I think. As for rendering entire game worlds with that level of detail... I'm not sure if that's so much a technical issue as it is one of art production.
 
Hey did you guys catch this in the current EGM?... It was a really small screenshot so it doesn't blow up very well... but for what it's worth....

xb360_vs_xb1.JPG
 
MetalAlien said:
Hey did you guys catch this in the current EGM?... It was a really small screenshot so it doesn't blow up very well... but for what it's worth....

xb360_vs_xb1.JPG

As Omnigamer said, that comparison shot was shown here before the issue came out. Also, it had no need to be blown up then because it was clear and without words mucking up the comparison. Also, there was another comparison shot with Kameo standing in front of a castle.

Neither is particularly effective, imho.

1116499727.jpg


If the image can't be hotlinked, click here

1116499728.jpg


If the image can't be hotlinked, click here
 
Amir0x said:
As Omnigamer said, that comparison shot was shown here before the issue came out. Also, it had no need to be blown up then because it was clear and without words mucking up the comparison. Also, there was another comparison shot with Kameo standing in front of a castle.

Neither is particularly effective, imho.

1116499727.jpg


If the image can't be hotlinked, click here

1116499728.jpg


If the image can't be hotlinked, click here


The first one shows a huge leap, though with big AA issues (should be cleaned up now that the Beta kits are out). The bottom one is not as dramatic.
 
I wouldn't classify either as dramatic. They both feature what can be described as a "substantial" leap, but it just seems like the baseline for what we should be expecting next-gen, so it's not as impressive. I hope that once they use the other 70% of the system's power (olol) it comes through more potently.
 
Amir0x said:
I wouldn't classify either as dramatic. They both feature what can be described as a "substantial" leap, but it just seems like the baseline for what we should be expecting next-gen, so it's not as impressive. I hope that once they use the other 70% of the system's power (olol) it comes through more potently.

Well, two comments. First, I agree that neither is "substantial". Second, I said "more dramatic", meaning relative to the first. Just saying the jump was better in one than the other.

And I agree that the shots are not up to what should be expected from next gen, though some later shots and the scale in some scenes are closer.
 
GhaleonEB said:
Well, two comments. First, I agree that neither is "substantial". Second, I said "more dramatic", meaning relative to the first. Just saying the jump was better in one than the other.

No, I said both shots ARE substantial, just not dramatic. I mean when you break it down, there's an awful lot of improvements across the board being shown in these images. The first jump was better than the other by far, though.

GhaleonEB said:
And I agree that the shots are not up to what should be expected from next gen, though some later shots and the scale in some scenes are closer.

I think I have to get an appreciation for the scale of the world before I really pass final judgment on just how "next-gen" it is. That said, I don't like the trend (for next-gen games) of feeling that lots of enemies equals impressive. From a technical standpoint this may be true, but it doesn't make character models or art direction any less ugly, ya know?
 
Amir0x said:
I think I have to get an appreciation for the scale of the world before I really pass final judgment on just how "next-gen" it is. That said, I don't like the trend (for next-gen games) of feeling that lots of enemies equals impressive. From a technical standpoint this may be true, but it doesn't make character models or art direction any less ugly, ya know?


Agreed there. I think developers found that the easiest thing to do is take existing models and put LOTS AND LOTS OF THEM in the same scene. That's certainly the case with Kameo and that hilarious looking zombie game, with five models repeated a hundred times each in the same scene. I suspect it will be a 1st generation gimmick, and then the power will begin to be tapped a little more effectively.

That said, one of the things I expect from this gen is a big increase in scale. But it's gotta be more than just a copy paste of this gen's models.
 
Wow, you guys are picky. What exactly would make the game look next gen to you guys aside from the aliasing? The textures look great, I don't see a lot of stray polys, the lighting and shadowing look good and there's plenty of grass :).
 
Amir0x said:
As Omnigamer said, that comparison shot was shown here before the issue came out. Also, it had no need to be blown up then because it was clear and without words mucking up the comparison. Also, there was another comparison shot with Kameo standing in front of a castle.

Neither is particularly effective, imho.

1116499727.jpg


If the image can't be hotlinked, click here

1116499728.jpg


If the image can't be hotlinked, click here


Aww cool, yeah, I'm sure there are some others like me who, even with scouring the board, still missed those.
 
"I don't think it's right to assume the PS3 will be twice the 360, but I don't think it's right to assume that they're gonna be even in power either. Cell already drubs the XeCPU. Ignoring bandwidth questions, RSX has a transistor and clock speed advantage, as well as a time advantage. RSX doesn't have to be massively more powerful than Xenos to be a hell of a chip. Just for an example, let's assume NVidia figured out how to make 64/128bit HDR useable at good framerates. That could translate to the lighting we saw in most of those PS3 demos. Most of what was seen in those videos was largely due to great lighting. There's so much debate about how much those videos represent the PS3 game graphics, but if we were to get that level of performance, I think it would constitute a noticeable step up in graphics, no? Until we know what RSX is packing, the graphics advantage could be anywhere from marginal to monumental. I think it can go either way. "


I don't believe it has a transistor advantage. Also, there have been reports of the G70 producing a lot of heat at the 90nm process, so it remains to be seen if they're going to successfully reach the 550MHz clockspeed number for the RSX. And I don't know about the RSX, but according to that article HDR lighting and displacement can be done in only ONE pass on the Xenos. All in all, it's still too early to compare the two. It will be too early until we see some benchmarks, because efficiency is going to play a huge role here.
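For a sense of why "64/128-bit HDR at good framerates" is the hard part, here's a rough back-of-the-envelope sketch in Python. The resolution, framerate and overdraw figures are purely illustrative assumptions; the point is only the relative cost of the wider framebuffer formats compared with a standard 32-bit buffer.

```python
def framebuffer_write_bandwidth_gb_s(width, height, bytes_per_pixel, fps, overdraw=3.0):
    """Approximate colour-buffer write traffic in GB/s (ignores Z, blending reads, textures)."""
    bytes_per_frame = width * height * bytes_per_pixel * overdraw
    return bytes_per_frame * fps / 1e9

# Bytes per pixel for each colour format (assumed resolution/framerate below).
formats = {
    "32-bit (8 bits per channel)": 4,        # standard RGBA8
    "64-bit FP16 HDR (16 bits per channel)": 8,
    "128-bit FP32 HDR (32 bits per channel)": 16,
}

for name, bpp in formats.items():
    bw = framebuffer_write_bandwidth_gb_s(1280, 720, bpp, fps=60)
    print(f"{name}: ~{bw:.1f} GB/s of colour writes at 720p / 60 fps")
```

Whatever the absolute numbers, the FP16 buffer costs roughly 2x and the FP32 buffer roughly 4x the write bandwidth of a plain 32-bit buffer (and blending doubles it again), which is why framerate is the worry.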
 
jimbo said:
I don't believe it has a transistor advantage.

Not in terms of total logic, but there may be one in terms of computational logic.

jimbo said:
Also, there have been reports of the G70 producing a lot of heat at the 90nm process, so it remains to be seen if they're going to successfully reach the 550MHz clockspeed number for the RSX.

G70 is being manufactured at 110nm.

edit - just saw a report crop up saying it was being manufactured at 90nm, but 110nm has been reported everywhere else. Even at 90nm, this is the GTX version, so there should be room for upward clocking; I should hope so anyway!
 
My gut feeling says the 550MHz target for RSX was more last-minute one-upmanship than anything else. I wouldn't be surprised if one or both of the GPUs fail to meet their target goals. It has happened MANY times before, most recently with the NV2A (250MHz to 233MHz) and Flipper (200MHz to 162MHz).

It's all about yielding to the yields. (har har)
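For reference, here's that history as quick arithmetic (Python). The NV2A and Flipper figures are the ones quoted above; applying the same percentage shortfall to a 550 MHz RSX target is only an illustration, not a prediction.

```python
# Past GPU clock-target misses, as quoted in the post above (target MHz, shipped MHz).
past_misses = {
    "NV2A (Xbox)": (250, 233),
    "Flipper (GameCube)": (200, 162),
}

rsx_target = 550  # MHz, the announced RSX figure

for chip, (target, actual) in past_misses.items():
    shortfall = 1 - actual / target
    print(f"{chip}: {target} -> {actual} MHz, a {shortfall:.1%} drop")
    print(f"  the same drop from a {rsx_target} MHz target: ~{rsx_target * (1 - shortfall):.0f} MHz")
```

So an NV2A-sized miss would land RSX around 510 MHz, while a Flipper-sized one would pull it down toward the mid-400s.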
 
Shogmaster said:
My gut feeling says the 550MHz target for RSX was more last-minute one-upmanship than anything else. I wouldn't be surprised if one or both of the GPUs fail to meet their target goals. It has happened MANY times before, most recently with the NV2A (250MHz to 233MHz) and Flipper (200MHz to 162MHz).

It's all about yielding to the yields. (har har)

Also about tying into the processor and bus speeds. Flipper *and* Gekko dropped I think due to needing to keep the multiples.

I don't know how 3.2GHz/550MHz matches up (nearly a 6x multiplier), but yeah, Sony will reserve wiggle room.

Considering none of the current top end chips run at that speed, it'd be impressive if they make it in consoles. (although maybe the fact that Sony can use 90nm and the PC card guys are still on 130/110 would help)
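A rough sketch of the multiplier question (Python). The candidate reference clocks below are illustrative guesses, not known X360 or PS3 specs.

```python
from fractions import Fraction

cpu_mhz = 3200

# Exact CPU/GPU clock ratios for the two GPU figures being discussed.
for gpu_mhz in (550, 500):
    ratio = Fraction(cpu_mhz, gpu_mhz)
    print(f"{cpu_mhz} / {gpu_mhz} = {float(ratio):.2f} ({ratio})")

# If both clocks hang off a common reference clock, the multipliers would be:
for base_mhz in (100, 50):  # hypothetical reference clocks, for illustration only
    print(f"base {base_mhz} MHz -> CPU x{cpu_mhz // base_mhz}, "
          f"550 MHz GPU x{550 / base_mhz:g}, 500 MHz GPU x{500 / base_mhz:g}")
```

With a 100 MHz reference you get exactly the 32x/5x split asked about further down for a 500 MHz GPU, while 550 MHz needs a fractional 5.5x step; 3200/550 itself works out to about 5.8, the "nearly 6x" figure.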
 
Shogmaster said:
My gut feeling says the 550MHz target for RSX was more last-minute one-upmanship than anything else. I wouldn't be surprised if one or both of the GPUs fail to meet their target goals. It has happened MANY times before, most recently with the NV2A (250MHz to 233MHz) and Flipper (200MHz to 162MHz).

It's all about yielding to the yields. (har har)

I'll be more disappointed if Xenos falls to under 500 MHz.

It won't look as bad if RSX drops below 550 MHz but stays at or above 500 MHz. :)
 
mrklaw said:
Also about tying into the processor and bus speeds. Flipper *and* Gekko dropped I think due to needing to keep the multiples.

Actually, Flipper dropped and Gekko went up (400MHz to 485MHz). Even though the spin at the time was that devs felt the GC benefitted more from a faster CPU than a faster GPU, I think it was ultimately down to Flipper yields not supporting the low MSRP Nintendo set for the GC. Flipper is much bigger and more complex than Gekko, so I think it makes sense that they had yield problems with Flipper rather than Gekko.

I don't know how 3.2GHz/550MHz matches up (nearly a 6x multiplier), but yeah, Sony will reserve wiggle room.

Shit, I'm still a little foggy on how the multipliers are set up for the X360 (3.2GHz CPU and 500MHz GPU = 32x multiplier for the CPU and 5x multiplier for the GPU? And the FSB = 1GHz? Doesn't sound right somehow).

Considering none of the current top end chips run at that speed, it'd be impressive if they make it in consoles. (although maybe the fact that Sony can use 90nm and the PC card guys are still on 130/110 would help)

Well, the rumor going around is that the G70 is actually fabbed on 90nm instead of 110... and it's gonna be faster than the 430MHz that was rumored.
 
mrklaw said:
Also about tying into the processor and bus speeds. Flipper *and* Gekko dropped I think due to needing to keep the multiples.

Gekko's clock was *raised* from 405 MHz to *485* MHz, while Flipper's clock dropped from 202.5 MHz to 162 MHz. The Gamecube's bus speeds and bandwidth figures also dropped in line with Flipper's drop.

edit: oh maybe you were saying that Gekko's *bus* speed dropped?
 
I don't know if it was posted, but here's where the suggestion that the G70 is a 90nm part comes from:

http://www.penstarsys.com/editor/company/nvidia/g70_spec/g70_spec_2.htm

The article would appear to have been changed since it was originally posted. It originally had this in it (a little RSX talk too):

At the J.P. Morgan technology conference, NVIDIA gave a 15 minute presentation with a short Q&A. Marv Burkett, CFO of NVIDIA, gave the presentation. Most of the presentation talked about NVIDIA's current financial position, how their products are positioned in the market, and how well certain aspects of the business are growing (GPUs and MCPs being the main growth areas). He also went on to state that the Consumer Electronics group (those in charge of products like the X-Box) will have very flat growth until around Q3, when they will start receiving income from the RSX (PS3 graphics unit). Their WMP (Wireless Media Products) division had a big customer last year, but that has since dried up. However, they are expecting two major customers to come on board next quarter, so that area should be shored up.

In his talk he covered quite a few topics, and some of the bigger ones were the RSX and 90 nm products. Currently the RSX is still in development, and no actual silicon is available as of yet. Looking at Sony's timeline, I would expect the RSX to be taped out by the end of this Summer, and that first silicon will be available in late Fall. Once all the little problems are fixed and the design is working as it should, Sony will take over production and pay NVIDIA a royalty for the use of their technology. While overall revenue from this deal will be lower than the X-Box, NVIDIA will not have to worry about things such as production schedules, poor yields, and the other pitfalls of handling the production portion of a GPU. This will of course have a positive effect on net profits though, since this will essentially be "free money" from work previously done. Sony has laid out a good chunk of change for the current design work, and I would imagine that delivery of first silicon will be faster than I am quoting because Sony owns and runs the Fab that the silicon will be produced on (without having NVIDIA pay out the wazoo for an accelerated first run, you can expect Sony to give that product top priority in its Fab).

The demos that were running at E3 were apparently mainly running on SLI machines, as well as G70 parts. Marv talked about how these demos were run on an upcoming product with many capabilities similar to the RSX chip. So, while the RSX will have more features that are aimed at the PS3, we can expect this next generation of cards to nearly match the overall performance and feature-set of the RSX.

Now for the confusion. Earlier this year at a conference call with Jen-Hsun and the gang, it was stated that the first 90 nm parts were going to be introduced this Fall. Now we are hearing something different. At the J.P. Morgan conference, Marv Burkett clearly stated that the first 90 nm part will be introduced this quarter (which definitely cannot be characterized as "Fall"), and that all "large" products will be 90 nm from here on out. This suggests, in very strong language, that the G70 will be 90 nm (as it has not been released as of yet, and it is a large part). So, was the leak last week legitimate? If Marv really meant what he said, then no, the G70 will not be a 110 nm part.

It went on to suggest that NVidia might be throwing out misinformation re: the G70 to throw ATi off the scent. Who knows, I guess we'll see within a week... I think the leaked G70 info is pretty believable tbh.

edit -
dorio said:
This is off topic, but why do you have a 360 avatar, gofreak, since you're so pro-Sony?

I guess it doesn't appeal to your need to pigeonhole, huh? ;) Seriously though, I may have a liking for what Sony are doing technically with PS3, but that doesn't mean I don't like X360. I've had this avatar since pretty much the first X360 pics hit the web; it may be time for a change soon.
 
here is what seems to be the older and larger version of that article

Mass Confusion about NVIDIA's G70

And Other Parts

by Josh Walrath



Last week some of the first good-looking information on the G70 from NVIDIA was leaked. This info pointed towards the G70 being a 110 nm part clocked at 430 MHz, featuring 24 pixel pipelines (six quads), and gave some other pertinent information. The materials leaked with the specs also made it look legitimate. Now, I am just not so sure.

At the J.P. Morgan technology conference, NVIDIA gave a 15 minute presentation with a short Q&A. Marv Burkett, CFO of NVIDIA, gave the presentation. Most of the presentation talked about NVIDIA's current financial position, how their products are positioned in the market, and how well certain aspects of the business are growing (GPUs and MCPs being the main growth areas). He also went on to state that the Consumer Electronics group (those in charge of products like the X-Box) will have very flat growth until around Q3, when they will start receiving income from the RSX (PS3 graphics unit). Their WMP (Wireless Media Products) division had a big customer last year, but that has since dried up. However, they are expecting two major customers to come on board next quarter, so that area should be shored up.

In his talk he covered quite a few topics, and some of the bigger ones were the RSX and 90 nm products. Currently the RSX is still in development, and no actual silicon is available as of yet. Looking at Sony's timeline, I would expect the RSX to be taped out by the end of this Summer, and that first silicon will be available in late Fall. Once all the little problems are fixed and the design is working as it should, Sony will take over production and pay NVIDIA a royalty for the use of their technology. While overall revenue from this deal will be lower than the X-Box, NVIDIA will not have to worry about things such as production schedules, poor yields, and the other pitfalls of handling the production portion of a GPU. This will of course have a positive effect on net profits though, since this will essentially be "free money" from work previously done. Sony has laid out a good chunk of change for the current design work, and I would imagine that delivery of first silicon will be faster than I am quoting because Sony owns and runs the Fab that the silicon will be produced on (without having NVIDIA pay out the wazoo for an accelerated first run, you can expect Sony to give that product top priority in its Fab).

The demos that were running at E3 were apparently mainly running on SLI machines, as well as G70 parts. Marv talked about how these demos were run on an upcoming product with many capabilities similar to the RSX chip. So, while the RSX will have more features that are aimed at the PS3, we can expect this next generation of cards to nearly match the overall performance and feature-set of the RSX.

Now for the confusion. Earlier this year at a conference call with Jen-Hsun and the gang, it was stated that the first 90 nm parts were going to be introduced this Fall. Now we are hearing something different. At the J.P. Morgan conference, Marv Burkett clearly stated that the first 90 nm part will be introduced this quarter (which definitely cannot be characterized as "Fall"), and that all "large" products will be 90 nm from here on out. This suggests, in very strong language, that the G70 will be 90 nm (as it has not been released as of yet, and it is a large part). So, was the leak last week legitimate? If Marv really meant what he said, then no, the G70 will not be a 110 nm part.

The amount of confusion that NVIDIA has spread about their products in the past two years in terms of leaks has been pretty astonishing. Nobody has a handle on what is going to be introduced, and while the big picture is fairly well known, the details are not. We all know that the next gen of products will have a faster clockspeed, and that they will feature at least 24 pixel pipelines. Other than that, it is a lot of guesswork. Now, one noted hoax that NVIDIA perpetrated was that of hinting the NV40 was an 8x2 architecture. Apparently NVIDIA delivered "special" cards to some developers that showed up as 8x2, and of course this information was leaked to the internet community, and ATI was able to see what was going on. At this point ATI thought they were sitting pretty with their X800 Pro and X800 XT PE. A 12 pixel pipeline card running at 475 MHz should just destroy an 8x2 architected 350 MHz part. Of course the XT PE would wipe the floor with the competition. Then April rolled around last year and we saw that the NV40 was a 16 pipeline design, the 6800 GT was significantly faster than the X800 Pro, and the 6800 Ultra matched the X800 XT PE. As we saw, ATI had to introduce the X800 XT near the end of Summer of last year to be able to offer a part more competitive with the NVIDIA range of cards (and gave users something between the middling performance of the X800 Pro and the outstanding performance of the X800 XT PE). Unfortunately for ATI, they had some serious supply issues, and their XT and XT-PE parts were very hard to find.

Throughout the past 5 months we have been hearing many conflicting reports about what the G70 will be. If Burkett is giving us a true glimpse (which I think he is), then we can speculate on what we can expect to see. First off the G70 will be 90 nm (and not the 110 nm that we were all expecting), and it will probably be clocked significantly higher than the 430 MHz that the leaked presentation documented. We can also expect a part that is around 300 million transistors. Depending on how NVIDIA has allocated those transistors, I think we will see a minimum of 24 pixel pipelines. There has been a lot of talk about the possibility of 32 pixel pipelines, but I just don't know if that will happen. My conservative nature says no, but it is a distinct possibility that there could be essentially 32 pixel pipelines. I think we will also see a new multi-sampling unit that will be able to handle HDR content (unlike the current unit). Other things such as PureVideo will of course be included, and we will probably see a couple of new wrinkles. The "GT" version of this part could be clocked around 450 MHz, while the "Ultra" edition of this part will probably be 500 MHz+. Power consumption will still be around 6800 Ultra levels.

With that out of the way, we can move on to the fun stuff! Now, this is all speculation as essentially NOTHING of the other G7x products has been leaked. I have a feeling that with the overall success of TSMC's 90 nm process (which is apparently very, very healthy) we can expect to see NVIDIA phasing out its NV40/41/45/48 chips. These are very large at 130 nm, and are not as cost effective as they once were. I feel that there is going to be a large turnover in the $250 to $400 range with a new set of products. The NV43/44/44a will continue to address the low end to the $200 market, but the large 130 nm NV4x parts will soon be replaced by smaller, more cost effective 90 nm parts. I think we will see some true competition to ATI's 110 nm X800 series (the X800 and X800 XL). The new series of 90 nm products will feature the same pixel pipeline design of the G70, and all of the optimizations that entails. If my speculation is correct then the low end 90 nm part will be a 12 pixel pipeline product running between 450 MHz and 500 MHz. This will compete with the X800, and from past indications on per clock performance of the NV4x architecture, this should be faster than the X800, yet still be priced around the $249 level. The next step up will be a full 16 pixel pipeline design running around 500 MHz. This will compete with the X800 XL in price, but will of course be faster. If this product does in fact exist, and is sold around the $299 mark, then it could seriously be the best bang for the buck that we have seen since the X800 XL. This leaves room for one more product.

A G7x part with 16 pixel pipelines and running at 600 MHz would exist in the $350 to $400 price range. This part would of course spank all of the current high end cards (6800 Ultra, X800/X850 XT PE), yet be offered at a lower price point. While this card would be very fast, it will still not be able to compete with the high end G70 parts priced at $450 and above. A massive move to 90 nm would give NVIDIA a pretty solid segmentation of products, and allow them to stop their 130 nm production of large parts. The only real question here is what will happen to the 110 nm NV42. Would NVIDIA be better off keeping that part and moving it down to the $200 price point, keeping the 6200 and 6600 parts at sub-$175? Or will the 6600 GT still be the best part at just under $200, with the NV42 phased out? My gut feeling is that NVIDIA will stop production on the NV42, as it honestly gives about the same overall performance as the cheaper 6600 GT. So, by the end of this summer, NVIDIA will only be producing 110 nm NV43/44/44a and the 90 nm G7x parts.

Again, much of this is speculation based on comments by Marv Burkett, as well as some other small leaks and info that is floating around. When ATI released their R300 in the form of the 9700 Pro, and NVIDIA was left sitting with the aging GeForce 4 Ti series to compete with this product while the NV30 had not yet seen the light of day, Jen-Hsun challenged his people to match ATI, and he essentially said, "This is war!" If NVIDIA is continuing with that philosophy, then we can expect to see a lot more disinformation on coming products, and the smoke will get amazingly thick. The only thing we shouldn't do is underestimate NVIDIA. It is a very aggressive company, and their engineering talent is seriously second to none. Hopefully ATI will have taken this challenge seriously, and we can expect to see some impressive parts from them as well. The R520 does not look to be a slouch, but hopefully ATI has not been lulled into complacency with the rumors that the G70 is a lower clocked 110 nm part.
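The 8x2 misdirection story in the article boils down to pipelines-times-clock arithmetic. Here's a quick sketch (Python); the hoax and X800 Pro figures are the ones quoted in the article, while the X800 XT PE and 6800 Ultra clocks are from memory and should be treated as approximate.

```python
# Theoretical pixel fill rate = pixel pipelines x core clock.
# Ignores per-clock efficiency, shader throughput, and memory bandwidth.
parts = {
    "NV40 as 'leaked' (8x2 hoax, 350 MHz)": (8, 350),
    "X800 Pro (12 pipes, 475 MHz)": (12, 475),
    "X800 XT PE (16 pipes, ~520 MHz, clock from memory)": (16, 520),
    "NV40 / 6800 Ultra actual (16 pipes, ~400 MHz, clock from memory)": (16, 400),
}

for name, (pipes, mhz) in parts.items():
    print(f"{name}: {pipes * mhz / 1000:.1f} Gpixels/s theoretical fill rate")
```

Raw fill rate obviously isn't the whole story (the article notes the 6800 Ultra matched the XT PE despite a lower clock), but it shows why an 8-pipe part at 350 MHz looked like such an easy target for ATI's lineup.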
 
Here is, I believe, the first picture available of Xenos...

http://www.beyond3d.com/#news24076

c1.jpg


As our "ATI Xenos: XBOX 360 Graphics Demystified" article investigates, the graphics processor for the XBOX 360 console is split into two elements, the main parent "shader core" (manufactured by TSMC at 90nm) which handles most of the graphics operations and the ALU arrays for processing shader programs and a secondary, daughter die (manufactured by NEC at 90nm) which handles the all the sample operations (colour read/blend/write, Multi-Sample AA, Z operations, etc.) and a fast dedicated 10MB of eDRAM that acts as the processors primary frame buffer, that has 256GB/s of bandwidth available to it. We've been supplied with an image of the chip package for the Xenos graphics processor:

The parent/daughter die is quite clearly evident on the same package from this image. While the 232M transistor figure for the parent was given to us by ATI, we are still trying to establish a more official figure for the daughter (even though these things are very much estimates). We've speculated that the 150M figure that appeared when XBOX 360 was first announced may just relate to the daughter die; however, another figure that has arisen is 100M. Judging from the die sizes, the daughter die doesn't have more than half the area of the parent, which would give indications towards the 100M side, although 80M of those transistors are DRAM, which may be more dense than the logic circuitry that will dominate the parent die. We are trying to get further clarification.
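To put the Beyond3D die-size reasoning above into numbers, here's a crude sketch (Python). The 0.5 area ratio and the 80M DRAM transistor figure come from the quote; the eDRAM-vs-logic density factors are illustrative assumptions, and the sketch also assumes the parent and daughter logic have comparable density even though they come from different fabs.

```python
# Crude version of the parent/daughter transistor-count reasoning quoted above.
parent_transistors = 232e6   # parent die figure given by ATI
daughter_area_ratio = 0.5    # "doesn't have more than half the area of the parent"
edram_transistors = 80e6     # 10 MB eDRAM ~= 80 Mbit ~= 80M transistors

# If the daughter die had the same density as the parent's logic:
same_density_estimate = parent_transistors * daughter_area_ratio
print(f"same-density estimate: ~{same_density_estimate / 1e6:.0f}M transistors")

# eDRAM packs tighter than logic, so the same area can hold more transistors overall.
for density_factor in (1.5, 2.0, 3.0):  # hypothetical eDRAM density multipliers
    logic_budget = same_density_estimate - edram_transistors / density_factor
    total = edram_transistors + logic_budget
    print(f"eDRAM {density_factor}x denser than logic -> ~{total / 1e6:.0f}M total "
          f"({logic_budget / 1e6:.0f}M logic + 80M DRAM)")
```

The denser the eDRAM packs relative to logic, the more total transistors fit in that half-size die, which is how the estimate can drift from the ~100M area-based guess toward the 150M figure.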
 