> Maybe not, but there are bad tessellation practices, especially when there's a company agenda behind them. As proven, a vast amount of geometry is being tessellated that isn't even in the camera view, like the water for example. Also, there are pieces of scenery with high triangle density that do not add much detail compared to normal rendering.

My god.
You do understand that all this talk about Crysis 2 having bad tessellation is a direct result of one company's agenda, right? The only thing the wireframe shots have proven is that there is no culling in wireframe mode. When you play the game in normal mode, geometry culling makes sure that anything that isn't visible isn't rendered at all. As for the pieces of scenery with high triangle density: even if the extra detail isn't always that visible, it costs about 1 fps on a GTX 560. Now re-read what I posted earlier.
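The culling point here can be sketched in a few lines. This is a generic illustration of view-frustum culling, not CryENGINE code; the `Plane`/`Sphere` types and the `insideFrustum` name are invented for the example:

```cpp
#include <array>
#include <cassert>

// Hypothetical sketch: a bounding-sphere test against the six view-frustum
// planes. Objects whose spheres fall entirely outside any one plane are
// skipped before rasterization, which is why a normal render pass draws far
// less geometry than an everything-visible wireframe dump.
struct Plane { float nx, ny, nz, d; };   // plane equation n·p + d = 0, n points inward
struct Sphere { float x, y, z, radius; };

bool insideFrustum(const Sphere& s, const std::array<Plane, 6>& frustum) {
    for (const Plane& p : frustum) {
        float dist = p.nx * s.x + p.ny * s.y + p.nz * s.z + p.d;
        if (dist < -s.radius)            // sphere fully behind this plane: cull it
            return false;
    }
    return true;                         // inside or intersecting every plane: draw it
}
```

A real engine would test against the camera's actual projection planes (and usually use AABBs or a hierarchy), but the pass/cull decision is the same shape.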
Is that...is that a ball sack?
> Is that...is that a ball sack?

Have you never seen a woman from behind before?
Very nifty.
=( Now I have buyer's remorse...
Time to repress it with Crysis 2
[images]
Just the kick I need to jump into Crysis 2, having recently finished Crysis and Crysis Warhead.
Edit: More Alice: Madness Returns :swoon:
> You do understand that all this talk about Crysis 2 having bad tessellation is a direct result of one company's agenda, right? The only thing the wireframe shots have proven is that there is no culling in wireframe mode. When you play the game in normal mode, geometry culling makes sure that anything that isn't visible isn't rendered at all. As for the pieces of scenery with high triangle density: even if the extra detail isn't always that visible, it costs about 1 fps on a GTX 560. Now re-read what I posted earlier.

That's what I'm talking about: the game could be better optimized for both vendors, if not for trying to make the other company look bad. I think HAWX was the other game that over-tessellated models.
> That's what I'm talking about: the game could be better optimized for both vendors, if not for trying to make the other company look bad. I think HAWX was the other game that over-tessellated models.

How do you optimise a game for hardware that is lacking in power? By removing graphical complexity from it? And that's a good thing because?.. I always prefer to have more graphical complexity even if my video card can't handle it, or struggles to, right now, because my next video card will be fine with it; if that complexity isn't in the game at all, then I'll never be able to use it. And that's why I'll never get these complaints from AMD about some game running badly on their hardware: make better hardware, goddammit. I was using a GTX 280 while AMD was adding DX11 renderers to Dirt 2 and Battleforge, which I couldn't use, but it never even came into my head to say that AMD was making NV look bad (which was true) and thus needed to stop adding DX11 to games because GeForces couldn't handle it right now (which is complete nonsense). Now why wouldn't NVIDIA use the strengths of their GPUs to their benefit? That's the free market: the better product wins, end of story. And it's not like AMD has never done the same thing in games sponsored by them; they're pushing Eyefinity as hard as they can right now, for example. Should they stop just because NV doesn't have it? I don't think so.
> I must admit I wasn't aware of the wireframe behavior in Crysis 2. Where did you learn that, dr_rus?

The link to the Crytek programmers saying that was posted on a previous page of this thread, if I'm not mistaken. But it's just common sense, really.
> How do you optimise a game for hardware that is lacking in power? By removing graphical complexity from it? And that's a good thing because?.. I always prefer to have more graphical complexity even if my video card can't handle it, or struggles to, right now, because my next video card will be fine with it; if that complexity isn't in the game at all, then I'll never be able to use it. And that's why I'll never get these complaints from AMD about some game running badly on their hardware: make better hardware, goddammit. I was using a GTX 280 while AMD was adding DX11 renderers to Dirt 2 and Battleforge, which I couldn't use, but it never even came into my head to say that AMD was making NV look bad (which was true) and thus needed to stop adding DX11 to games because GeForces couldn't handle it right now (which is complete nonsense). Now why wouldn't NVIDIA use the strengths of their GPUs to their benefit? That's the free market: the better product wins, end of story. And it's not like AMD has never done the same thing in games sponsored by them; they're pushing Eyefinity as hard as they can right now, for example. Should they stop just because NV doesn't have it? I don't think so.

The bolded part is not a comparable example at all; one card was an entire API generation behind. The point is that some assets in Crysis 2 were being over-tessellated to the point of diminishing returns, a point where the user might not notice the difference, just because one manufacturer has dedicated more silicon to the tessellation engine.
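The "diminishing returns" argument is usually handled with distance-adaptive tessellation. Here's a hypothetical sketch (not Crysis 2's actual shader; the function name, falloff shape, and distance thresholds are made up) of how an engine can clamp subdivision so that distant or low-impact geometry never gets maximum triangle density:

```cpp
#include <algorithm>
#include <cassert>

// Hypothetical distance-based tessellation factor: full subdivision up close,
// falling off linearly to none at a far threshold. The 64.0f ceiling matches
// the D3D11 hull-shader maximum; everything else is an illustrative choice.
float tessFactor(float distanceToCamera,
                 float maxFactor = 64.0f,     // D3D11 maximum tessellation factor
                 float fullDetailDist = 5.0f, // closer than this: full subdivision
                 float noDetailDist = 100.0f) // farther than this: no subdivision
{
    // Linear falloff between the full-detail and no-detail distances.
    float t = (noDetailDist - distanceToCamera) / (noDetailDist - fullDetailDist);
    t = std::clamp(t, 0.0f, 1.0f);
    return 1.0f + t * (maxFactor - 1.0f);     // factor 1 = leave the mesh alone
}
```

With a clamp like this, a flat slab 80 m away gets a handful of triangles instead of the same density as geometry the player is standing next to; skipping it is what produces the extreme wireframe shots being argued about above.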
Sure did. Maybe -1.5 will do? But it's weird cause I'm using the recommended "no blur" custom bit. Also, does that behavior flag influence transparency supersampling?
A couple of things:
- AA behavior flags can and will affect things, so set the flag to none.
- AA mode needs to be set to override.
- AA setting needs to be set to 4x MSAA to get 4x SGSSAA.
Ok, thx. Changed it to none. Isn't it generally better to use the AA settings provided by the game rather than overriding them? DA has a 1x-8x MSAA choice. The only thing I wondered while tweaking the Inspector profile was whether I should use "Enhance in-game AA" if I'm only adding SGSSAA, or leave it at "Application controlled".