
The latest Far Cry 1.2 Benchmarks: (SM 2.0b ATI support)

Izzy

Banned
As you can see, the impact of geometry instancing is colossal: a 24% performance boost in a scene with a truly extreme geometry load is terrific. It is important to note that without geometry instancing, all contemporary graphics cards appear to be CPU-bound at 1024x768 and 1280x1024, which means that on a processor less powerful than an AMD Athlon 64 3400+, the performance advantage from geometry instancing under heavy geometry loads will be even greater.
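To put the above in context: geometry instancing lets the engine submit every copy of a repeated mesh (trees, rocks, grass) in one draw call instead of one call per object, and that per-call CPU overhead is exactly why the cards sit CPU-bound until instancing is enabled. Here's a rough Direct3D 9-style sketch of the idea (this is not FarCry's actual code, and names like treeMeshVB are made up):

#include <d3d9.h>

// Draws numTrees copies of a single tree mesh in one call. Assumes the
// index buffer and vertex declaration are already set, and that the
// vertex shader reads a per-instance transform from stream 1.
void drawForestInstanced(IDirect3DDevice9* device,
                         IDirect3DVertexBuffer9* treeMeshVB,  // shared mesh
                         IDirect3DVertexBuffer9* instanceVB,  // per-tree transforms
                         UINT numTrees, UINT numVertices, UINT numTriangles,
                         UINT vertexStride, UINT instanceStride)
{
    // Stream 0 carries the mesh, tagged with how many instances to draw.
    device->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | numTrees);
    device->SetStreamSource(0, treeMeshVB, 0, vertexStride);

    // Stream 1 steps forward once per instance, not once per vertex.
    device->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);
    device->SetStreamSource(1, instanceVB, 0, instanceStride);

    // One call instead of numTrees calls; this is where the CPU time was going.
    device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0,
                                 numVertices, 0, numTriangles);

    // Restore default frequencies so later non-instanced draws work.
    device->SetStreamSourceFreq(0, 1);
    device->SetStreamSourceFreq(1, 1);
}

Without instancing you'd loop numTrees times, setting a shader constant and calling DrawIndexedPrimitive each iteration, and that loop is the CPU dependence the article is measuring.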



[Charts: Research level, 1024x768 / 1280x1024 / 1600x1200, "Pure Speed" mode]



Both the RADEON X800 series and the GeForce 6800 series get only the slightest of speed increases, because they calculate “long” pixel shaders instead of a multitude of “short” ones.

Neither the ATI RADEON X800 XT nor the RADEON X800 PRO could outperform the competing offerings from NVIDIA, to say nothing of the RADEON 9800 XT, which remains a galaxy behind the GeForce 6800.
 

ManaByte

Gold Member
to say nothing of the RADEON 9800 XT, which remains a galaxy behind the GeForce 6800.

Uh duh? That's like saying the SNES remains a galaxy behind the N64. Why even compare a previous-gen card to a next-gen card in such a way?
 

Tenguman

Member
ManaByte said:
Uh duh? That's like saying the SNES remains a galaxy behind the N64. Why even compare a previous-gen card to a next-gen card in such a way?
Because the card that's a generation behind is more expensive by almost $100
 

Izzy

Banned
seismologist said:
6800 series looks like it's owning everybody. Plus doesn't it have more features too?

Agreed. The GF 6800 GT is hands down the best high-end card. With SM 3.0 capability and this level of performance, it's a steal at its price. I'm definitely buying one (once it becomes available).
 

Izzy

Banned
With the Shader Model 2.0b render path now available in a commercial product and functioning flawlessly, ATI Technologies sends a clear message to gamers: the excellent performance the RADEON X800 XT and the RADEON X800 PRO deliver now can be improved in the future. As with Shader Model 3.0, the performance advantage provided today is not really huge; however, future titles that use more per-pixel lighting and higher-complexity geometry are unlikely to become something the RADEON X800 series cannot handle.

Even though ATI considerably improved the performance of the RADEON X800-series graphics cards in FarCry's "Pure Speed" mode, NVIDIA's GeForce 6800 Ultra can still claim performance leadership here, as in quite a lot of cases the GeForce 6800 Ultra delivers slightly higher speed than the rival RADEON X800 XT.

The same holds for the slightly less expensive options – the GeForce 6800 GT and the RADEON X800 PRO – where 16 pixel pipelines definitely help the former beat the latter in quite a lot of cases.



Unfortunately, the RADEON X800 PRO still lags behind the competing GeForce 6800 GT, whereas the RADEON 9800 XT does not seem to be a really strong rival for the GeForce 6800, at least in FarCry.

Contrary to expectations, the GeForce FX 5950 Ultra did not gain any performance boost from the commercial 1.2 patch, which means the whole GeForce FX family cannot be recommended for FarCry, as those cards are not going to deliver solid performance with the premier image quality that is the game's main draw.

Well, that about wraps it up for this patch.
 

Vandiger

Member
Well, this just confirms that the 6800 GT is the best-value card of the current generation. You'd be a fool to get the Ultra considering the GT can overclock to Ultra speeds and uses only one molex connector. I think the only reason to wait is maybe for the PCI Express version and being able to SLI, a la 3dfx. My only regret about buying the card is that I didn't jump sooner to take advantage of the BB or Buy.com preorder :p
 

Tenguman

Member
trippingmartian said:
Everyone's always saying to get a 9800 Pro, but I don't even see it in the benchmarks. What's up with that?
Because it'll run even slower than the 9800xt. If it's that slow, why bench it?
 

Buggy Loop

Gold Member
The benchmarks show it: last gen it was ATI's 9800 over Nvidia's FX. This gen it's 6800 > X800.

It's called competition, you know, something we hadn't seen since Voodoo 2/3 vs. TNT2.
 

Mareg

Member
Wow, and no more than a month ago ATI was trumpeting Far Cry as an example of the Radeon X800's superiority for this generation of GPUs!

Talk about a letdown... HL2 had better be the savior of ATI, or else... (I might actually switch camps)
 

Izzy

Banned
Mareg said:
Wow, and no more than a month ago ATI was trumpeting Far Cry as an example of the Radeon X800's superiority for this generation of GPUs!

Talk about a letdown... HL2 had better be the savior of ATI, or else... (I might actually switch camps)

Well, I don't know about that....check out preliminary Half Life 2 benches:


[Charts: Half-Life 2 "dx1sf" level, 1024x768 / 1280x1024 / 1600x1200, "Pure Speed" mode]
 
Sal Paradise Jr

Um, wow. Talk about selective benchies. Jesus, did anyone bother to check out the other benchies from this same report? I guess it's hard when a link to the site isn't even provided. :rolleyes:

Look, I can too!

[Charts: Training level, 1024x768 / 1280x1024 / 1600x1200, "Pure Speed" mode]


At high resolutions, w/ lots o' stuff going on, and w/ AA and AF enabled (a.k.a. how you TEST and STRESS a card), the X800 series is still king.

[Charts: Training level, 1024x768 / 1280x1024 / 1600x1200, "Eye-Candy" mode (AA/AF enabled)]


Oh, look more!!!

[Charts: Research level, 1024x768 / 1280x1024 / 1600x1200, "Eye-Candy" mode (AA/AF enabled)]


There's no doubt Nvidia got its stuff together this generation. And it definitely is the cardmaker to go w/ if Doom 3 is your main reason for purchasing a new card. But it's getting tiring combating all the Nvidiot spin, bullshit, misleading benchies, etc. Jesus, that part of Nvidia is FULL BLOWN this gen.

Edit - WTF is the point of testing these "next-generation" cards if you're going to test at 1024x768 w/ no AA or AF? That's what Xbitlabs' "pure speed" setting is.
 

XMonkey

lacks enthusiasm.
Izzy said:
Well, I don't know about that....check out preliminary Half Life 2 benches:

Hey, I know, let's run some tests on the absolutely unoptimized piece of software known as the leaked HL2 Alpha (or Beta, if you prefer...) and act like it totally means something!
 
Sal Paradise Jr

[Charts: Volcano level, 1024x768 / 1280x1024 / 1600x1200, "Eye-Candy" mode]




[Charts: Pier level, 1024x768 / 1280x1024 / 1600x1200, "Eye-Candy" mode]



[Charts: Pier (high) level, 1024x768 / 1280x1024 / 1600x1200, "Eye-Candy" mode]


Benchmarks in "Eye-Candy" mode revealed nothing new: the overwhelming advantage of the RADEON X800 XT over all the other graphics cards in FarCry became even more indisputable thanks to the moderate performance boost achieved from the more efficient rendering paths. Nevertheless, the boost is not enough to drive the RADEON X800 PRO to the leading position among $399 graphics cards in FarCry.

It is important to point out that ATI's RADEON X800 XT and PRO graphics cards handle extreme geometry loads better than NVIDIA's GeForce 6800 Ultra and GT, which may mean that, from this standpoint, ATI's visual processing units are more future-proof than NVIDIA's latest graphics processing units.

Additionally, ATI's RADEON X800 XT traditionally calculates complex pixel shaders faster than anything else available today. Unfortunately, the RADEON X800 PRO still lags behind the competing GeForce 6800 GT, whereas the RADEON 9800 XT does not seem to be a really strong rival for the GeForce 6800, at least in FarCry.

I will say though, ATI's got nothing on Nvidia in the $399 market, and the GT seems to be a much better value than its Ultra counterpart. There are rumors, though, that ATI's cooking up a 16-pipe variant of the X800 XT that'll be underclocked, meant to replace the PRO as a direct competitor to the GT.

Time will tell.
 

Shompola

Banned
Wow, what a big letdown the X800 PRO is, no matter how you look at it. And that card is supposed to be priced the same as the 6800 GT, right?
 
So has anyone got their hands on the X800 XT yet? The only version released that I know of is the Xtasy brand (sold out in many places).
 

Andy787

Banned
Wow. Thank you so much for actually looking through the article and presenting that, Sal Paradise. I was really starting to think I'd made the wrong decision ordering an X800 XT, as I've been seeing a lot of spin for Nvidia lately after the Doom 3 article at HardOCP, despite the X800 XT killing the 6800U earlier in pretty much every other game benchmarked on that very same site. Those benchmarks with full FSAA and AF make me very, very happy. I was wondering why those weren't posted in the first place, since they're what really matter: they actually show how hard the video card can be pushed, as opposed to those "pure speed" benches, which are all but completely reliant on the CPU. I mean, that's what these cards are made for in the first place.

Benchmarks in "Eye-Candy" mode revealed nothing new: the overwhelming advantage of the RADEON X800 XT over all the other graphics cards in FarCry became even more indisputable thanks to the moderate performance boost achieved from the more efficient rendering paths. Nevertheless, the boost is not enough to drive the RADEON X800 PRO to the leading position among $399 graphics cards in FarCry.

It is important to point out that ATI's RADEON X800 XT and PRO graphics cards handle extreme geometry loads better than NVIDIA's GeForce 6800 Ultra and GT, which may mean that, from this standpoint, ATI's visual processing units are more future-proof than NVIDIA's latest graphics processing units.

That's all I wanted to hear.
 

tenchir

Member
Shompola said:
Wow, what a big letdown the X800 PRO is, no matter how you look at it. And that card is supposed to be priced the same as the 6800 GT, right?

So I assume you're never going to use FSAA/AF, ever? The X800 Pro/XT has been performing better than the 6800 GT/Ultra in a lot of games (especially with AA/AF on), and suddenly it's a really bad buy because it doesn't do as well in FarCry with no AA/AF enabled, or because it's 5-15 fps (depending on AA/AF) behind the 6800 in Doom 3 at already-good framerates?
 

Badabing

Time ta STEP IT UP
Whenever I turn on AA with my Radeon 8500, my games lag. But my computer, other than my Radeon, is pretty fast: a 2.8GHz processor, 1GB of dual-channel RAM...

I don't get it. Games like Counter-Strike even run laggy.
 

tenchir

Member
Badabing said:
Whenever I turn on AA with my Radeon 8500, my games lag. But my computer, other than my Radeon, is pretty fast: a 2.8GHz processor, 1GB of dual-channel RAM...

I don't get it. Games like Counter-Strike even run laggy.

I had the same problem when I had an 8500; it suddenly disappeared when I installed the Catalyst 4.3 drivers. After that I bought a 9800 Pro, so I don't know if the problem is still there with the 8500 and 4.4 and up. Try checking the Rage3D forums.
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
Looks like Nvidia won this round, which is great for us. We don't want one company to win all the time, or there won't be competition.

Last generation, ATI beat Nvidia silly. While Nvidia isn't as far ahead as ATI was last time, at least they're ahead.

I guess I shouldn't be too surprised. The X800s are just tweaked 9800s, while the 6800s are brand spanking new.
 

AntoneM

Member
The only problem I currently have with ATI is that I never know whether the X800s are being benched with optimizations off. I know that while playing I wouldn't notice, and in that respect ATI wins. I just want to know which card performs best with the most stuff enabled and no optimizations. Anyway, as said above, with all the bells and whistles ATI wins again, but Nvidia really got their shit together and the GT is a great buy.
 

Andy787

Banned
teh_pwn said:
Looks like Nvidia won this round, which is great for us. We don't want one company to win all the time, or there won't be competition.

Last generation, ATI beat Nvidia silly. While Nvidia isn't as far ahead as ATI was last time, at least they're ahead.

I guess I shouldn't be too surprised. The X800s are just tweaked 9800s, while the 6800s are brand spanking new.
Huh? Look again. Nvidia still lost. :p
 

FightyF

Banned
I wish they would list the average retail price beside each card... it would make the comparison much easier. I googled each card name in there and saw 5 different prices for each :p

It's all very interesting though... and I plan to wait until HL2 is out before making a decision.

Thanks for sharing the info guys (and for showing both sides of the issue)!
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
Whoops, didn't see the others.

Wow, what a fanboy. Selecting only certain benches to entice people to buy their favorite company's card. Not cool. Not cool at all.
 
seismologist

So as long as I keep AA & AF off, my 6800 GT will be faster than an X800 XT?

Are AA & AF really needed? The HardOCP Doom 3 benchmarks seem to indicate they're not.
 
Sal Paradise Jr said:
Um, wow. Talk about selective benchies. Jesus, did anyone bother to check out the other benchies from this same report? I guess it's hard when a link to the site isn't even provided. :rolleyes:

Look, I can too!

[Charts: Training level, 1024x768 / 1280x1024 / 1600x1200, "Pure Speed" mode]

At high resolutions, w/ lots o' stuff going on, and w/ AA and AF enabled (a.k.a. how you TEST and STRESS a card), the X800 series is still king.

[Charts: Training level, 1024x768 / 1280x1024 / 1600x1200, "Eye-Candy" mode (AA/AF enabled)]

Oh, look more!!!

[Charts: Research level, 1024x768 / 1280x1024 / 1600x1200, "Eye-Candy" mode (AA/AF enabled)]

There's no doubt Nvidia got its stuff together this generation. And it definitely is the cardmaker to go w/ if Doom 3 is your main reason for purchasing a new card. But it's getting tiring combating all the Nvidiot spin, bullshit, misleading benchies, etc. Jesus, that part of Nvidia is FULL BLOWN this gen.

Edit - WTF is the point of testing these "next-generation" cards if you're going to test at 1024x768 w/ no AA or AF? That's what Xbitlabs' "pure speed" setting is.



[Images: "owned" reaction macros]
 

Mareg

Member
seismologist said:
So as long as I keep AA & AF off, my 6800 GT will be faster than an X800 XT?

Are AA & AF really needed? The HardOCP Doom 3 benchmarks seem to indicate they're not.
Doom 3 and FarCry are different stories altogether. Doom 3 is designed so that it doesn't need AA as badly; FarCry does benefit from some AA. But for me, AA is just a waste of GPU power, and I prefer cranking the resolution up and adding some AF (8x is a great setting).

Come to think of it, that's the default High Quality setting in Doom 3 (0x AA, 8x AF).
 
Sal Paradise Jr

seismologist said:
Nvidia is still winning in my eyes. Also, the 6800 series is more future-proof with SM 3.0 support.

Um, what about Nvidia's cards makes them more future-proof? SM 3.0? The ONLY feature from that set that is going to be used extensively is geometry instancing. It's a neat little trick that can cut the renderer's workload by a high percentage in complex scenes.

ATI cards have had this feature since the R3xx. What part of better geometry processing and better pixel shader processing don't people understand???

It states right there in this report (and it wouldn't be the first one) that the ATI cards are going to be better suited for future apps. Unless, of course, a game is hand-tailored (literally) for an Nvidia card.

So now AA/AF aren't really important? Gotcha. *checks off list*
 
Sal Paradise Jr said:
So now AA/AF aren't really important? Gotcha. *checks off list*

Well, it's a trade-off. According to these benches, even on the ATI card, enabling AA/AF cuts your framerate in half.

Hard to say if it's worth it, especially in newer games where you're struggling to stay above 30fps.
 
Sal Paradise Jr

Well, I don't think anyone can argue that there's always going to be a trade-off between picture quality/IQ and speed. It's always about finding the perfect balance, and those needs are going to be different for every individual based on their preferences.

Some people prefer to play games at 1024 with 4xAA/8xAF. Others prefer gaming at 1600x1200 with maybe just a touch of AF.

The point is you can't accurately measure a card's performance and future potential the way Xbitlabs goes about it with their "Pure Speed" mode. It's something I've always had an issue with. The same goes for their "DX9" :rolleyes: benchmarks, which are really just the leaked HL2 code.

It's hard to tell which sites are being objective anymore. All I know is that currently I trust no one other than Beyond3D. They have the most knowledgeable graphics forum on the web. Great site, and they're always clear about how they go about testing.
 

Pimpbaa

Member
marsomega said:
No, the X800 cards use an AF optimization that can't be turned off.

No, the X800 (and the 9600) have trilinear optimizations that can't be turned off. AF optimizations (adaptive AF) have been in ATI cards since the 8500.
 

Phoenix

Member
I really wish they would start doing their benchmarks by price point. I couldn't give a rat's ass how fast a $500-$700 card performs, because there is a 0% chance that I'm dropping that kind of money on a video card. And I have the money to do it; I simply refuse to spend that much on something with that kind of depreciation. That has got to be the worst return on investment in the universe!
 

rastex

Banned
Hey Sal,

Thanks for divulging the secret of the "Pure speed" setting. When these benches came up before, I asked what the Pure speed setting was and nobody responded; at that point I knew something fishy was going on. For the $400 price point the 6800 GT seems like the card to get, and for the top end the X800 XT is the way to go. Sounds good to me.
 

Izzy

Banned
Sal Paradise Jr said:
Um, what about Nvidia's cards makes them more future-proof? SM 3.0? The ONLY feature from that set that is going to be used extensively is geometry instancing. It's a neat little trick that can cut the renderer's workload by a high percentage in complex scenes.

ATI cards have had this feature since the R3xx. What part of better geometry processing and better pixel shader processing don't people understand???

It states right there in this report (and it wouldn't be the first one) that the ATI cards are going to be better suited for future apps. Unless, of course, a game is hand-tailored (literally) for an Nvidia card.

That's definitely not true. Let me help you out with a description of some of the more worthwhile SM 3.0 features:


-One of the more major upgrades in Shader Model 3.0 is the addition of vertex texture lookups, which enables features like displacement mapping. If there is going to be any major image-quality difference between Shader Model 3.0 and 2.0, it will come from displacement mapping. Bump mapping, which is currently used to give the appearance of height in a texture, is just an illusion: the surface is physically unchanged, so if you look at it from the side or dead on you can see it's still flat, and bump mapping only holds up from a distance. Even then it isn't the best option, because with the surface still physically flat, light and shadows don't behave correctly. The answer is displacement mapping, which physically adds surface detail by manipulating the height of the geometry, and which can even go as far as creating the model itself. Displacement mapping could be a huge boon to realism in games. If developers pick up on this technology and we see it implemented, this right here could be the deciding feature that shows the biggest difference between a game rendered in Shader Model 3.0 and one rendered in Shader Model 2.0. (There's a toy sketch of the idea after these two points.)


-Dynamic branching is another new feature that Shader Model 3.0 has and Shader Model 2.0 does not. It gives programmers control over the actual flow of the shader, starting and stopping code where they see fit instead of executing straight through from the first line of the shader to the last. What this allows is faster shader performance, since work can be skipped entirely where it isn't needed.
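To picture what a vertex texture lookup buys you, here's a toy C++ version of the displacement idea. The real thing runs per-vertex on the GPU, and everything below is made up purely for illustration:

struct Vertex {
    float pos[3];     // object-space position
    float normal[3];  // unit surface normal
    float u, v;       // texture coordinates, 0..1
};

// Toy nearest-texel lookup into a height map with values in 0..1.
float sampleHeight(const float* heightMap, int width, int height,
                   float u, float v)
{
    int x = static_cast<int>(u * (width - 1));
    int y = static_cast<int>(v * (height - 1));
    return heightMap[y * width + x];
}

// Displacement mapping: physically move the vertex along its normal by the
// sampled height. Unlike bump mapping, silhouettes and shadows come out
// right, because the geometry itself changes.
void displaceVertex(Vertex& vert, const float* heightMap,
                    int width, int height, float scale)
{
    float h = sampleHeight(heightMap, width, height, vert.u, vert.v);
    for (int i = 0; i < 3; ++i)
        vert.pos[i] += vert.normal[i] * h * scale;
}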
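And here's a loose C++ stand-in for the dynamic branching point; on SM 2.0 hardware both sides of a condition are effectively paid for on every pixel, while SM 3.0 can genuinely skip one. The function names are invented:

// Stand-in for an expensive stretch of per-pixel lighting math.
float costlyAttenuation(float dist, float range)
{
    float f = 1.0f - dist / range;
    return f * f * f * f;  // pretend this is dozens of shader instructions
}

float shadePixel(float baseColor, float distToLight, float lightRange)
{
    // SM 3.0 dynamic branch: pixels outside the light's range return early
    // and never execute the costly path. SM 2.0 has no real flow control,
    // so it would compute both results for every pixel and select one.
    if (distToLight > lightRange)
        return baseColor;
    return baseColor * costlyAttenuation(distToLight, lightRange);
}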

Hope that helps, Sal.
 