Shadow of Mordor offers optional Ultra texture download, recommends 6GB VRAM @ 1080p

I'm interested to see how my GTX 690 (2x2GB) copes with this. I don't mind running games below max settings. I just want it to play smoothly and not look ugly.
 
In this post you mention the textures aren't on Ultra, no?

Yes, that's my point: they are on HIGH and the PS4 is higher. Look at the rag to the right of the picture compared to the High textures; on console it is better and equal to Ultra.

I have both the PS4 and PC copies and have looked at many textures; the consoles run Ultra textures, and it makes perfect sense, really.

EDIT: I can see many are still missing the point: PC HIGH textures are lower than the PS4 textures, and the only step up from there on PC is Ultra.

If you want, I can post some more comparisons from my PC @ Ultra and the PS4.
 

You are actually incorrect. WD actually loads medium and ultra textures on consoles and then blends the two together; it was a performance optimization for the console profile. It should also be noted that it does this in the stock PC version. At some point, an unofficial patch took the console streaming optimization out of the PC version, which resulted in a substantial VRAM cutdown as well as fixing stuttering.
 
Yes, that's my point: they are on HIGH and the PS4 is higher. Look at the rag to the right of the picture compared to the High textures on console.

I have both the PS4 and PC copies and have looked at many textures; the consoles run Ultra textures, and it makes perfect sense, really.

I'm not seeing it myself, although it's pretty hard to make out any sort of fine detail with the amount of JPG compression in those pictures.

Edit: But yeah, this is wildly off-topic now (I actually misread and thought you were talking about the Shadow of Mordor Ultra textures on PS4 originally).
 
It is enough. The Witcher 2 will make most PCs cry on its highest settings; that doesn't mean 3GB of VRAM won't be able to appropriately cover all your games. Expecting all games, especially those tailored around PC, to run with everything maxed is unrealistic. No consumer PC out there can handle new games at 4K/60 fps yet; that doesn't mean the card is garbage.

I'm not sure what you mean. My MSI laptop with an 880M runs The Witcher 2 just fine on ultra. What I'm talking about is this new shift to large VRAM pools, which most people argued wouldn't happen until 2015. I'm not expecting my 3GB 780 to run at 4K; I'm not asking it to. I'm asking that my card run a cross-platform game, on a PC with much beefier specs than the consoles, at the highest setting. It's not a crazy idea.
 
Exactly what happened in the past gen is happening again: cards ended up needing at least the same amount of total RAM as the consoles. I can't know if we'll need 8GB this time or if we'll be fine with less, but upgrading at the very beginning of a new console gen isn't reliable.
 
You are actually incorrect. WD actually loads medium and ultra textures on consoles and then blends the two together; it was a performance optimization for the console profile. It should also be noted that it does this in the stock PC version. At some point, an unofficial patch took the console streaming optimization out of the PC version, which resulted in a substantial VRAM cutdown as well as fixing stuttering.

Source?

I have both copies, and having played and compared them, I see no difference between PC Ultra and the PS4.
 
Actually, what they did is not more work; it's less work.

Basically, for every developer a good, straightforward way to work is to create textures without limitations and scale them down after that.

A good way to know the best size for a texture is to analyze how many of its texels will be visible on screen. But that is only a good way, not the best way. The best way is to combine that info with how many seconds those texels will be visible over a complete run of the game. That info could be gathered with automatic tools, but I can't find a developer that takes it into account.

I can give a few examples using Watch Dogs. That game uses big textures for ad panels that sit a few meters above the streets. You can't reach those panels, so in normal gameplay you never see even half the texels of those textures at 1080p. But the game uses the same resolution for grass textures that are used everywhere: you look at them frequently, and on top of that the grass textures are stretched across big zones of terrain, so the texel-to-pixel ratio is horrible. Same for roads. A road texture spread over 20 meters of terrain has the same resolution as a newspaper decal on a street corner.

Why does that happen?

Striking a good balance when sizing textures is a lot of work. What's more, you need the memory budget beforehand to start the calculation, and it has to work over the premises I stated before: an ordered list going from the most visible texture to the least visible, taking into account how many texels each will need at a fixed screen resolution.
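To make that concrete, here is a rough sketch of the kind of prioritisation I mean; the texture names, coverage numbers and memory budget below are all invented for illustration, not taken from any real game:

```python
# Rank textures by how much screen they cover and for how long they are seen,
# then spend a fixed memory budget from the top down. All figures are invented.

BYTES_PER_TEXEL = 1.0   # assume ~1 byte/texel block compression (e.g. BC7)
MIP_OVERHEAD = 4 / 3    # a full mip chain adds roughly a third

def texture_bytes(size):
    """Approximate VRAM for one square compressed texture with mips."""
    return size * size * BYTES_PER_TEXEL * MIP_OVERHEAD

# (name, authored size, peak on-screen texel coverage at 1080p, seconds visible per playthrough)
textures = [
    ("grass_ground",    4096, 900_000, 20_000),
    ("road_asphalt",    4096, 700_000, 15_000),
    ("ad_panel_03",     4096,  80_000,    300),
    ("newspaper_decal", 4096,  20_000,     60),
]

budget = 48 * 1024 * 1024   # pretend we only have 48 MB for this texture set

# Score = texels actually seen, weighted by how long they stay on screen.
ranked = sorted(textures, key=lambda t: t[2] * t[3], reverse=True)

used = 0
for name, authored, coverage, seconds in ranked:
    size = authored
    # Halve the resolution until this texture fits in what's left of the budget.
    while size > 256 and used + texture_bytes(size) > budget:
        size //= 2
    used += texture_bytes(size)
    print(f"{name}: ship at {size}x{size}")
```

The textures that are seen a lot keep their full resolution and the barely seen decals absorb the downscale, which is the opposite of what a blanket quality slider does.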

As difficult as that task is, it is easier on a closed platform. Watch Dogs on PS4 uses a clever mix of ultra/high/medium textures (not a perfect mix, by the way). On PC they take the fast path: we have 1Kx1K textures for the environment and 4Kx4K textures for characters, so Ultra texture quality is every texture at max resolution, High is every texture at half, and Normal at one quarter. So the fucking dirt and leaves on the streets end up a blurry mess, and who cares that the player will be looking at that texture for half of the game.
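For a rough sense of how much VRAM that blanket max/half/quarter scaling moves around, some back-of-the-envelope arithmetic; the ~1 byte per texel figure assumes typical block compression with a full mip chain, which is my assumption, not anything from the game:

```python
def texture_mb(size, bytes_per_texel=1.0, mip_overhead=4 / 3):
    # Approximate size of one square block-compressed texture with a full mip chain.
    return size * size * bytes_per_texel * mip_overhead / (1024 ** 2)

for label, size in [("Ultra (full)", 4096), ("High (half)", 2048), ("Normal (quarter)", 1024)]:
    print(f"{label:17s} {size}x{size}  ~{texture_mb(size):5.1f} MB each")
# Half resolution is a quarter of the memory; quarter resolution is a sixteenth.
```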

So, if the Mordor game has a few 4K textures for important assets in the console versions, I guess Ultra quality on PC means every fucking texture is maxed; they don't care that most of them aren't really necessary.
Thanks for this.

Everybody read this please.
 
I just realized that it has been less than 90 days since I bought my GTX 780....

Isn't there some kind of trade-up program??!!

I've never looked into it before.
 
I didn't even know there was a GPU that had 6GB of VRAM. High is enough for me; I'm glad I went with the 3GB R9 280X instead of the R9 280 or GTX 770.
 
I'm going to wait for comparison shots before I run to the console versions in despair.
 
Yes, that's my point: they are on HIGH and the PS4 is higher. Look at the rag to the right of the picture compared to the High textures on console.

I have both the PS4 and PC copies and have looked at many textures; the consoles run Ultra textures, and it makes perfect sense, really.

You do realize that the higher VRAM requirement also improves mip mapping and overall texture LOD?

I have both versions as well, and there is a clear difference.


http://www.eurogamer.net/articles/digitalfoundry-2014-watch-dogs-face-off

Go there and look at the comparison with the balloons; in the far upper left you can see a very clear difference in the bricks.
 
Yeah, this myth keeps going: IT IS NOT. Console textures are Ultra.

[image: nm87RL8.jpg]
 
Exactly what happened in the past gen is happening again: cards ended up needing at least the same amount of total RAM as the consoles. I can't know if we'll need 8GB this time or if we'll be fine with less, but upgrading at the very beginning of a new console gen isn't reliable.

Consoles don't have 8GB for the GPU.
 
Why? Just lower the texture setting and play at your native res. No one is forcing you to use these options.

It'll vary from game to game, though. I'm not really confident my GTX 660 with 1.5GB will run things very smoothly at 1080p while still looking decent.
 
It'll vary from game to game, though. I'm not really confident my GTX 660 with 1.5GB will run things very smoothly at 1080p while still looking decent.

Any texture detail you gain from a higher setting will be instantly lost from the shitty upscaling you get on PC.
 
I think it's going to become the new extreme tessellation. They'll all leave textures uncompressed just because... this move is going to make all the 'ultra or upgrade' enthusiasts really groan. I have no problem choosing High settings, but I wouldn't ever want to select Medium anything with brand-new hardware. We're all so stubborn.

If you just bought a brand-new graphics card that alone costs more than an Xbox One or a PS4, then I don't think you'd be wrong to expect more than what PC gamers have been given with some of these games, such as the Watch Dogs port and possibly this one. There is nothing this game is doing graphically that would justify a total requirement of 14GB of RAM (8GB system RAM + 6GB of VRAM) to max out. If this were a Crysis situation, which it is not, it would be understandable and more than acceptable, since Crysis looked far beyond what consoles were providing at the time; it was a game optimized for PCs that actually put the additional power to use to produce something mind-blowing. This, on the other hand, looks cross-gennish. What we seem to be getting with a lot of these games is highly optimized console versions alongside quick-and-dirty PC versions that let you get the extra features, like higher-resolution textures and framerates, by brute-forcing your way through. Games like Battlefield 4 and Crysis 3 look better, and they require nowhere near the resources this is asking for.
 
One thing I'm learning pretty quickly: when I do get to upgrade my GPU, maybe in March with tax season, I dang sure will not be buying one with any less than 4GB of VRAM.
 
Skyrim with most textures at 4K resolution doesn't even use 6GB of VRAM. Ultra in this game had better be 8K textures to justify that VRAM requirement. Unless they're loading every texture in the game at once? Which would be pretty silly.
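For what it's worth, some quick math on that, assuming block-compressed 4K textures at roughly 1 byte per texel plus a full mip chain (my assumption, not anything measured from either game):

```python
texture_mb = 4096 * 4096 * 1.0 * (4 / 3) / (1024 ** 2)  # one 4K texture with mips, ~21.3 MB
print(f"One 4K texture: ~{texture_mb:.1f} MB")
print(f"6 GB would hold roughly {6 * 1024 / texture_mb:.0f} such textures resident at once")
```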

Not to pick on you in particular, but why are there so many comments in these threads from people with no technical knowledge explicitly or implicitly calling devs stupid and lazy because spec requirements are going up?

Typical forum egoism, or just trying to defend a video card purchase that no longer makes sense?

* * *
Crysis released back in the day with options intended to take advantage of future graphics specs. People went nuts; even to this day 'will it play Crysis' is still a thing.

Durante had a thread stating that, as a dev, it is in his self-interest to release a game with kneecapped ultra settings, because otherwise he would be called lazy and stupid and everyone would think the game was terribly optimized instead of just future-proofed.

It makes me think these devs should just not include these options and/or just use the crappier 360 version as the base for PC to prevent the anger and loss of sales.
 
I don't really understand your point; what does Watch Dogs have to do with Shadow of Mordor? "Ultra" isn't some kind of universal standard.

It was a comment about WD running only High textures on console, proving that PC does not need more than 2GB.

You do realize that the higher VRAM requirement also improves mip mapping and overall texture LOD?

I have both versions as well, and there is a clear difference.


http://www.eurogamer.net/articles/digitalfoundry-2014-watch-dogs-face-off

Go there and look at the comparison with the balloons; in the far upper left you can see a very clear difference in the bricks.

Yeah, because DF never makes mistakes. Fact is, I have the game, posted shots, and looked at both versions, and the PS4 has textures that look like Ultra with no visible difference in key areas. Whether you believe it or not is moot, and we are off topic, so that's the last I will say on it, but this post explains it perfectly.

Actually, what they did is not more work; it's less work.

Basically, for every developer a good, straightforward way to work is to create textures without limitations and scale them down after that.

A good way to know the best size for a texture is to analyze how many of its texels will be visible on screen. But that is only a good way, not the best way. The best way is to combine that info with how many seconds those texels will be visible over a complete run of the game. That info could be gathered with automatic tools, but I can't find a developer that takes it into account.

I can give a few examples using Watch Dogs. That game uses big textures for ad panels that sit a few meters above the streets. You can't reach those panels, so in normal gameplay you never see even half the texels of those textures at 1080p. But the game uses the same resolution for grass textures that are used everywhere: you look at them frequently, and on top of that the grass textures are stretched across big zones of terrain, so the texel-to-pixel ratio is horrible. Same for roads. A road texture spread over 20 meters of terrain has the same resolution as a newspaper decal on a street corner.

Why does that happen?

Striking a good balance when sizing textures is a lot of work. What's more, you need the memory budget beforehand to start the calculation, and it has to work over the premises I stated before: an ordered list going from the most visible texture to the least visible, taking into account how many texels each will need at a fixed screen resolution.

As difficult as that task is, it is easier on a closed platform. Watch Dogs on PS4 uses a clever mix of ultra/high/medium textures (not a perfect mix, by the way). On PC they take the fast path: we have 1Kx1K textures for the environment and 4Kx4K textures for characters, so Ultra texture quality is every texture at max resolution, High is every texture at half, and Normal at one quarter. So the fucking dirt and leaves on the streets end up a blurry mess, and who cares that the player will be looking at that texture for half of the game.

So, if the Mordor game has a few 4K textures for important assets in the console versions, I guess Ultra quality on PC means every fucking texture is maxed; they don't care that most of them aren't really necessary.



Great input to the discussion; do you have any thoughts yourself, or do you just Google .gifs all day?
 
I'm not sure what you mean. My MSI laptop with an 880M runs The Witcher 2 just fine on ultra. What I'm talking about is this new shift to large VRAM pools, which most people argued wouldn't happen until 2015. I'm not expecting my 3GB 780 to run at 4K; I'm not asking it to. I'm asking that my card run a cross-platform game, on a PC with much beefier specs than the consoles, at the highest setting. It's not a crazy idea.

We're not seeing a large shift in VRAM pools for these games, though, at least not in the low- and mid-end requirements, not yet. It's 1GB minimum, 2GB recommended for basically all the new console ports that have been coming out; Advanced Warfare is another one coming out soon that's similar. Shadow of Mordor is the same. What separates it from the pack is that they're giving us a texture add-on we can download that cranks the VRAM requirements substantially beyond the base game.

The only game that might indicate a shift thus far, I would argue, is The Evil Within, since it's saying "lol, 4GB or go home", according to the news that came out today or yesterday or whatever. I'm skeptical, and we'll have to wait for benchmarks on both games.
 
If this is the direction PC gaming is taking with regard to multiplats, then it can go F itself; I'll stay with my Dota 2-runs-fine specs.
 
We're not seeing a large shift in VRAM pools for these games, though, at least not in the low- and mid-end requirements, not yet. It's 1GB minimum, 2GB recommended for basically all the new console ports that have been coming out; Advanced Warfare is another one coming out soon that's similar. Shadow of Mordor is the same. What separates it from the pack is that they're giving us a texture add-on we can download that cranks the VRAM requirements substantially beyond the base game.

The only game that might indicate a shift thus far, I would argue, is The Evil Within, since it's saying "lol, 4GB or go home", according to the news that came out today or yesterday or whatever. I'm skeptical, and we'll have to wait for benchmarks on both games.

I don't have the game so I can't comment, but how much VRAM did Wolfenstein use exactly? Gamegpu.ru claims approximately 1.5GB.
Seems low.
http://gamegpu.ru/action-/-fps-/-tps/wolfenstein-the-new-order-test-gpu.html

Scroll down to the video memory section.
 
I'm not sure what you mean. My MSI laptop with an 880M runs The Witcher 2 just fine on ultra. What I'm talking about is this new shift to large VRAM pools, which most people argued wouldn't happen until 2015. I'm not expecting my 3GB 780 to run at 4K; I'm not asking it to. I'm asking that my card run a cross-platform game, on a PC with much beefier specs than the consoles, at the highest setting. It's not a crazy idea.

Yes, it is a crazy idea. You don't get to decide what "highest" means and entails.
 
Consoles don't have 8GB for the GPU.
They don't have 6GB for the GPU either, and yet this game is asking for 6GB.

(I'm not saying you *will* need 6GB or 8GB. I'm saying that until we have a consistent idea of what the requirements for "next gen" games are going to be, upgrading your video card can end up disappointing.)
 
Isn't there only 4.5 - 5GB available to devs on the PS4 and X1?
Yes, and devs can use as much of that as they want for graphics. So if they can manage it, they will use only 1GB for CPU functions and the rest for graphics, or whatever combination they can manage. They don't have to worry about stepping over into system functions, because those are already reserved in the 3GB OS footprint (which, if we are lucky, we will see shrink over time).
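Rough numbers behind that split; the 8GB total is the real unified pool, while the 3GB OS reserve and the 1GB CPU share are just the ballpark figures from this post, not official breakdowns:

```python
total_gb = 8.0                              # unified memory on PS4 / Xbox One
os_reserve_gb = 3.0                         # footprint kept back for system functions
game_budget_gb = total_gb - os_reserve_gb   # ~5 GB left for the game

cpu_side_gb = 1.0                           # example split: gameplay / CPU-side data
gpu_side_gb = game_budget_gb - cpu_side_gb  # rest goes to textures, buffers, render targets
print(f"Game budget {game_budget_gb:.1f} GB -> CPU {cpu_side_gb:.1f} GB, graphics {gpu_side_gb:.1f} GB")
```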
 
Yeah, because DF never makes mistakes. Fact is, I have the game, posted shots, and looked at both versions, and the PS4 has textures that look like Ultra with no visible difference in key areas. Whether you believe it or not is moot, and we are off topic, so that's the last I will say on it, but this post explains it perfectly.
That post does explain it perfectly, and it also confirms (he would know, too) that it's a mixture of Ultra, High, and Medium textures. So he's actually confirming that you are wrong.
 
Did this thread really turn into a 15-page thread about people bitching about their old video cards becoming obsolete?

Come on, PC guys, act like you've been there before.
 