Okay, get this (for those of you who haven't read it): in the Feb XBN, on page 20, they wrote:
"insiders expect ATI's R500 to promise 10 times the polygon power and four times the pixel power of ATI's Radeon X800 XT chip-in other words, around 5 billion triangles per second and a 30 billion pixel per second fill rate, give or take some billions. With those sorts of numbers, we're not sure whether to laugh or just sit back and smile"
Five billion polygons/triangles and a 30 billion pixel fill rate? What the hell?
Now, if you go back and look at the leaked Xenon document, which has been called fairly accurate by some of those actually working on its games (arguably), you'll see it lists Xenon's performance at around 500 million polygons/triangles and a 4 billion pixel fill rate. There's a seemingly huge difference between these two sets of specifications.
Which one is right? I mean, which one is likely to be closer to the truth?
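Just to put rough numbers on the gap (taking both sets of figures at face value, which is a big assumption, since neither is an official spec), here's a quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope comparison of the two claimed spec sets.
# All figures come from the XBN quote and the leaked Xenon doc above;
# nothing here is an official number.

xbn_triangles_per_sec = 5_000_000_000    # XBN: "around 5 billion triangles per second"
xbn_fill_rate_pixels  = 30_000_000_000   # XBN: "30 billion pixel per second fill rate"

leak_triangles_per_sec = 500_000_000     # leaked doc: ~500 million polygons/triangles per second
leak_fill_rate_pixels  = 4_000_000_000   # leaked doc: ~4 billion pixel fill rate

print(f"polygon gap:   {xbn_triangles_per_sec / leak_triangles_per_sec:.1f}x")  # 10.0x
print(f"fill rate gap: {xbn_fill_rate_pixels / leak_fill_rate_pixels:.1f}x")    # 7.5x
```

So the two sources disagree by roughly 10x on polygon throughput and about 7.5x on fill rate, which is a lot more than rounding error.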
"insiders expect ATI's R500 to promise 10 times the polygon power and four times the pixel power of ATI's Radeon X800 XT chip-in other words, around 5 billion triangles per second and a 30 billion pixel per second fill rate, give or take some billions. With those sorts of numbers, we're not sure whether to laugh or just sit back and smile"
5 billion polygons/triangles and 30 billion pixel fillrate---what the hell?
Now, if you go back and look at the leaked Xenon document, which has been called fairly accurate by some of those actually working on its games (arguably) you will see it lists Xenon's performance at around 500 million polygons/triangles and 4 billion pixel fill rate. there's a seemingly huge difference between these two sets of specifications.
which one is right--I mean, which one is likely to be close to the truth?