Yeah, this myth keeps going. IT IS NOT true: console textures are Ultra.
I guess ultra quality for PC means every fucking texture is maxed; they don't care if most of them are not really necessary.
All Ultra aside from Textures?
In this post you mention the textures aren't on Ultra, no?
I guess High textures for me. My 290 4GB card is what I'm sticking with all generation.
If you want, I can post some more comparisons from my PC @ Ultra and PS4.
Yes, my point is they are on HIGH and PS4 is higher; look at the rag to the right of the picture compared to the High textures on console.
I have both the PS4 and PC copies and have looked at many textures; consoles run Ultra textures, and it makes perfect sense really.
It is enough. Witcher 2 will make most PCs cry on its highest settings; that doesn't mean 3GB of VRAM won't be able to appropriately cover all your games. Expecting all games, especially those tailored around PC, to run with everything maxed is unrealistic. No consumer PC out there can handle new games at 4K/60 fps yet; that doesn't mean the card is garbage.
You are actually incorrect. WD loads medium and ultra textures on consoles and then blends the two together; it was a performance optimization for the console profile. It should also be noted that it does this in the stock PC version too. At some point an unofficial patch took out the console streaming optimization on PC, which resulted in a substantial VRAM cutdown as well as fixing stuttering.
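For illustration only, here's a toy sketch of that blend idea: fading from a resident medium mip toward a higher-resolution mip as it streams in. None of the names or numbers come from Watch Dogs; it's just the general technique.

```python
import numpy as np

# Toy illustration (not the game's code): while the high-res mip of a texture
# is still streaming in, keep showing the resident medium mip and cross-fade
# toward the high-res data as it arrives.

def blend_mips(medium_mip: np.ndarray, ultra_mip: np.ndarray, progress: float) -> np.ndarray:
    """Cross-fade from the resident medium mip to the streamed ultra mip.

    medium_mip is half the resolution of ultra_mip; progress runs from
    0.0 (nothing streamed yet) to 1.0 (ultra fully resident).
    """
    upsampled = medium_mip.repeat(2, axis=0).repeat(2, axis=1)  # naive 2x upscale
    progress = float(np.clip(progress, 0.0, 1.0))
    # Linear fade avoids a visible pop when the high mip finishes loading.
    return (1.0 - progress) * upsampled + progress * ultra_mip

# Example: a 2x2 "medium" mip fading into a 4x4 "ultra" mip, half streamed.
medium = np.full((2, 2), 0.25)
ultra = np.random.rand(4, 4)
print(blend_mips(medium, ultra, progress=0.5))
```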
Good. Push it more. More! PC progress needs to wake from its 5 year nap. Push it til the smoke alarms scream.
Thanks for this. Actually, what they did is not more work, it's less work.
Basically, for every developer it's a good, straightforward way to work: create textures without limitations and scale them down after that.
A good way to know the best size for a texture is to analyze how many texels of the texture will be visible on screen. But that's a good way, not the best way. The best way is to combine that info with how many seconds those texels will be visible over a complete run of the game. That info could be gathered with automatic tools, but I can't find a developer that takes it into account.
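A rough sketch of that heuristic (made-up names and numbers, not from any real profiling tool): rank each texture by how much of it actually reaches the screen and for how long.

```python
from dataclasses import dataclass

# Rank textures by how many of their texels reach the screen, weighted by how
# long they stay visible over a full playthrough. The stats below are invented
# for illustration; a real tool would capture them automatically.

@dataclass
class TextureStats:
    name: str
    source_texels: int        # texels in the authored texture (w * h)
    peak_visible_texels: int  # most texels ever mapped to screen pixels at 1080p
    visible_seconds: float    # total seconds on screen in a complete run

def importance(t: TextureStats) -> float:
    """Higher = more worth spending resolution (and VRAM) on."""
    coverage = min(t.peak_visible_texels / t.source_texels, 1.0)
    return coverage * t.visible_seconds

stats = [
    TextureStats("grass_ground", 1024 * 1024, 900_000, 4000.0),  # seen constantly, up close
    TextureStats("billboard_ad", 1024 * 1024, 120_000, 30.0),    # far away, rarely on screen
]
for t in sorted(stats, key=importance, reverse=True):
    print(f"{t.name}: importance={importance(t):.1f}")
```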
I can give a few examples using Watch Dogs. That game uses big textures for ad panels that sit a few meters above the streets. You can't reach those panels, so in normal gameplay you are not seeing even half the texels of that texture at a 1080p screen resolution. But the game uses the same resolution for grass textures that are used everywhere: you look at them constantly, and on top of that the grass textures are stretched across big zones of terrain, so the texels-per-pixel ratio is horrible. Same for roads. A road texture used across 20 meters of terrain has the same resolution as a newspaper decal lying on a street corner.
Why does that happen?
Finding a good balance when sizing textures is a lot of work. Moreover, you need the memory requirements beforehand to start the calculation. It has to work from the premises I described before: an ordered list from the most visible texture to the least visible, taking into account how many texels each will need at a fixed screen resolution.
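Fitting that ordered list (ordered, say, by the importance score sketched above) into a fixed budget could look something like this. This is my own toy version, not any shipping pipeline; the sizes, the budget, and the BC1 cost model are made-up illustration values.

```python
# Walk from least to most important and keep halving resolutions until the
# total fits the budget. Numbers and format assumptions are illustrative only.

def bytes_for(width: int, height: int, bytes_per_texel: float = 0.5) -> int:
    """Rough BC1/DXT1 cost (0.5 bytes per texel) plus ~33% for the mip chain."""
    return int(width * height * bytes_per_texel * 4 / 3)

def fit_budget(textures, budget_bytes):
    """textures: list of (name, width, height), most- to least-important.
    Returns {name: (width, height)} after downscaling to fit the budget."""
    sizes = {name: (w, h) for name, w, h in textures}

    def total():
        return sum(bytes_for(w, h) for w, h in sizes.values())

    while total() > budget_bytes:
        shrunk = False
        for name, _, _ in reversed(textures):      # least important first
            w, h = sizes[name]
            if min(w, h) > 64:                     # keep a minimum resolution
                sizes[name] = (w // 2, h // 2)
                shrunk = True
                if total() <= budget_bytes:
                    break
        if not shrunk:
            break                                  # floor reached everywhere
    return sizes

ordered = [("grass_ground", 4096, 4096), ("road_main", 4096, 4096),
           ("billboard_ad", 4096, 4096), ("newspaper_decal", 2048, 2048)]
print(fit_budget(ordered, budget_bytes=24 * 1024 * 1024))
```

With these made-up inputs the most visible textures keep their full resolution and the least visible ones get halved first, which is the whole point of the ordering.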
As difficult as that task is, it's easiest on a closed platform. Watch Dogs on PS4 uses a clever mix of ultra/high/medium textures (not a perfect mix, btw). On PC they take the fast path: we have 1Kx1K textures for the environment and 4Kx4K textures for characters, so Ultra texture quality is every texture at max resolution, High is every texture at half, and Normal at one quarter. So the fucking dirt and leaves on the streets will be a blurry mess, and they don't care that the player will be looking at that texture half of the game.
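For contrast, that "fast path" boils down to a single global divisor per preset, applied to every texture no matter how visible it is. The preset names and divisors below are just the usual convention, not taken from any specific game.

```python
# One global divisor per quality preset, applied uniformly to every texture.
PRESET_DIVISOR = {"ultra": 1, "high": 2, "normal": 4}

def preset_size(width, height, preset):
    d = PRESET_DIVISOR[preset]
    return max(width // d, 1), max(height // d, 1)

for preset in ("ultra", "high", "normal"):
    print(preset, preset_size(4096, 4096, preset), preset_size(1024, 1024, preset))
```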
So, if the Mordor game has a few 4K textures for important assets in the console versions, I guess Ultra quality on PC means every fucking texture is maxed; they don't care if most of them are not really necessary.
Source?
I have both copies, have played and compared them, and I see no difference between PC Ultra and PS4.
Guess I'll be running some games on 900p until I can afford an upgrade. :lol
Exactly what happened last gen is happening again: cards ended up needing at least the same amount of total RAM as the consoles. I can't know if we'll need 8GB this time or if we'll be fine with less, but upgrading at the very beginning of a new console gen isn't reliable.
I just realized that it has been less than 90 Days since I bought my GTX 780....
Isn't there some kind of trade up program??!!
I've never looked into it before.
Why? Just lower the texture setting and play at your native res. No one is forcing you to use these options.
It'll vary from game to game, though. I'm not really confident my GTX 660 with 1.5GB will run things very smoothly at 1080p while still looking decent.
Any texture detail you gain from a higher setting will be instantly lost from the shitty upscaling you get on PC.
I think it's going to become the new extreme tessellation. They'll all leave textures uncompressed just because... this move is going to make all the 'ultra or upgrade' enthusiasts really groan. I have no problem choosing High settings, but I wouldn't ever want to select Medium anything with brand-new hardware. We're all so stubborn.
Skyrim with most textures at 4K resolution doesn't even use 6GB of VRAM. Ultra in this game had better be 8K textures to justify that VRAM requirement. Unless they're loading every texture in the game at once? Which would be pretty silly.
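For rough context, here's a back-of-envelope calculation of what a single 4K texture costs under common block-compression formats. Which formats any given game actually ships with is an assumption on my part.

```python
# Cost of one 4096x4096 texture under a few common formats (bytes per texel are
# the standard figures for these formats; the extra 1/3 covers the mip chain).

FORMATS = {
    "RGBA8 uncompressed": 4.0,
    "BC3/DXT5": 1.0,
    "BC1/DXT1": 0.5,
}

def texture_mib(width, height, bytes_per_texel):
    return width * height * bytes_per_texel * 4 / 3 / (1024 ** 2)

for name, bpt in FORMATS.items():
    print(f"4096x4096 {name}: {texture_mib(4096, 4096, bpt):.1f} MiB")
# ~85 MiB uncompressed, ~21 MiB BC3, ~11 MiB BC1 -- a hundred unique 4K BC3
# textures resident at once is only about 2 GiB.
```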
I don't really understand your point. What does Watch Dogs have to do with Shadow of Mordor? "Ultra" isn't some kind of universal standard.
You do realize that the higher VRAM requirement also improves mip mapping and overall texture LOD.
I have both versions as well, and there is a clear difference.
http://www.eurogamer.net/articles/digitalfoundry-2014-watch-dogs-face-off
Go there and look at the comparison with the balloons in the far upper left; you can see a very clear difference in the bricks.
depends on the manufacturer; I know EVGA has one.
It was a comment about WD running only High textures on console, proving that PC does not need more than 2GB.
I'm not sure what you mean. My MSI 880M laptop runs Witcher 2 just fine on Ultra. What I'm talking about is this new shift to large VRAM pools, which most people argued wouldn't happen until 2015. I'm not expecting my 3GB 780 to run at 4K; I'm not asking it to. I'm asking that my card run a cross-platform game on a PC with much beefier specs, and run it at the highest setting. It's not a crazy idea.
Oh my bad! I didn't follow the discussion all the way back, I assumed you were talking about Shadow of Mordor.
If this is the direction PC gaming is taking in regards to multiplats, then it can go F itself; I'll stay with my Dota2-runs-fine specs.
We're not seeing a large shift in VRAM pools for these games yet, though, at least not in the low- and mid-end requirements. It's 1GB minimum, 2GB recommended for basically all the new console ports that have been coming out; Advanced Warfare is another one coming out soon that's similar. Shadow of Mordor is the same. What separates it from the pack is that they're giving us a texture add-on we can download that cranks the VRAM requirements substantially beyond the base game.
The only game that might indicate a shift thus far, I would argue, is The Evil Within, since it's saying "lol, 4GB or go home", according to the news that came out today or yesterday or whatever. I'm skeptical, and we'll have to wait for benches on both games.
They don't have 6GB for the GPU either, and yet this game is asking for 6GB.
Consoles don't have 8GB for the GPU.
Yes, and devs can use as much of that as they can for graphics functions. So if they can manage it, they will use only 1GB for CPU functions and the rest for graphics, or whatever combination they can manage. They don't have to worry about stepping over into system functions, because those are already reserved in the 3GB footprint (which, if we are lucky, we will see decrease over time).
Isn't there only 4.5 - 5GB available to devs on the PS4 and X1?
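Just to put numbers on that split: the ~3GB reservation and the 1GB CPU-side figure are the rough estimates from the post above, not official specs.

```python
# Quick arithmetic on the split described above (rough, unofficial figures).
total_gb = 8.0                               # unified memory pool
os_reserved_gb = 3.0                         # kept by the system
game_budget_gb = total_gb - os_reserved_gb   # ~5GB left for the game

cpu_side_gb = 1.0                            # if CPU-side data is kept this lean...
gpu_side_gb = game_budget_gb - cpu_side_gb   # ...~4GB remains for graphics

print(f"game budget: {game_budget_gb:.0f}GB, of which ~{gpu_side_gb:.0f}GB usable as 'VRAM'")
```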
That post does explain it perfectly, and it also confirms (he would know, too) that it's a mixture of Ultra, High and Medium textures. So he's actually confirming you are wrong.
Yeah, cause DF never makes mistakes. Fact is, I have the game, posted shots, and looked at both games, and the PS4 has textures that look like Ultra, with no visible difference in key areas. Whether you believe it or not is moot, and we are off topic, so that's the last I will say on it, but this post explains it perfectly.
Because... the PS4 version has all Ultra textures....
PS4 it is.