Nvidia Launches GTX 980 And GTX 970 "Maxwell" Graphics Cards ($549 & $329)

Heh. Anything under 80°C and I feel like I'm fine. Though I haven't seen my 980 get hotter than 72°C.

Ok cool. Having only run my 670 at stock, but having zero issues, and not really fully understanding the longevity effects of OC'ing GPUs (if any?), I'd like that trend of zero issues to continue :)
 
Just got my EVGA 970 ACX 1.0 - the temperature is great under load. Maxed out at 70°C at 99% GPU utilization in Crysis 2. It crushes older games at 1440p so hard.

Noise is a non-issue - I live in a 19-bedroom frat house, so ambient noise is always a "thing" no matter what the circumstance :)
 
Logically, no. Compression is for transport, to use an item it has to be decompressed.

The trick is to get a compression / decompression algorithm that doesn't negate the time saved by transporting smaller amounts of data.

But once it's on the other end it is decompressed. Analogy: emailing a zip file uses less bandwidth than emailing uncompressed files, but it doesn't change the space those files take up once you actually want to use them on the other end.

Ok, thanks for clearing that up. It makes sense that they only mention the memory bandwidth benefits then.
 
Logically, no. Compression is for transport, to use an item it has to be decompressed.

The trick is to get a compression / decompression algorithm that doesn't negate the time saved by transporting smaller amounts of data.

But once it's on the other end it is decompressed. Analogy: emailing a zip file uses less bandwidth than emailing uncompressed files, but it doesn't change the space those files take up once you actually want to use them on the other end.

I made this mistake before. You only have to decompress the texture file you're drawing at that moment. You reference the compressed textures when creating the frame buffer, so it absolutely can reduce how much VRAM you need. Obviously the card and the game need to support the same compression.

That's why nothing is using the really good new compression coming in DX11.3 and DX12 yet, but these cards already have support, and when games get it, it will help a good deal in the specific titles that use it.

To use your zip file analogy: if every texture is zipped in its own file, and we only need to reference a few textures at a time, then reducing the compressed texture file size absolutely reduces how much VRAM we're using. Once they're in VRAM they only need to be decompressed when we need them, and that's never going to be all the textures at once.

http://en.wikipedia.org/wiki/Texture_compression

Because their data access patterns are well-defined, texture decompression may be executed on-the-fly during rendering as part of the overall graphics pipeline, reducing overall bandwidth and storage needs throughout the graphics system.
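
To make the "reference the compressed textures directly" idea concrete, here's a rough sketch in C of fetching a single texel straight out of a BC1/DXT1 block (the oldest and simplest of the GPU block-compression formats). Every 4x4 block is a self-contained 8 bytes, which is what lets the hardware sample a texture without ever decompressing the whole thing. Just an illustration of the idea, not actual driver or hardware code:

#include <stdint.h>

typedef struct { uint8_t r, g, b; } rgb8;

/* Expand a packed RGB565 color to 8 bits per channel. */
static rgb8 rgb565_to_rgb8(uint16_t c)
{
    rgb8 out;
    out.r = (uint8_t)((((c >> 11) & 0x1F) * 255) / 31);
    out.g = (uint8_t)((((c >>  5) & 0x3F) * 255) / 63);
    out.b = (uint8_t)(((c & 0x1F) * 255) / 31);
    return out;
}

/* Decode one texel from an 8-byte BC1 block; x and y are the texel's
   coordinates inside the 4x4 block (0..3). Because each block is
   independent, the sampler can jump straight to the block it needs. */
rgb8 bc1_fetch_texel(const uint8_t block[8], int x, int y)
{
    /* Two 16-bit endpoint colors, stored little-endian. */
    uint16_t c0 = (uint16_t)(block[0] | (block[1] << 8));
    uint16_t c1 = (uint16_t)(block[2] | (block[3] << 8));

    /* Build the 4-entry palette for this block. */
    rgb8 p[4];
    p[0] = rgb565_to_rgb8(c0);
    p[1] = rgb565_to_rgb8(c1);
    if (c0 > c1) {
        /* 4-color mode: two interpolated colors between the endpoints. */
        p[2].r = (uint8_t)((2 * p[0].r + p[1].r) / 3);
        p[2].g = (uint8_t)((2 * p[0].g + p[1].g) / 3);
        p[2].b = (uint8_t)((2 * p[0].b + p[1].b) / 3);
        p[3].r = (uint8_t)((p[0].r + 2 * p[1].r) / 3);
        p[3].g = (uint8_t)((p[0].g + 2 * p[1].g) / 3);
        p[3].b = (uint8_t)((p[0].b + 2 * p[1].b) / 3);
    } else {
        /* 3-color mode: one midpoint; index 3 means transparent black. */
        p[2].r = (uint8_t)((p[0].r + p[1].r) / 2);
        p[2].g = (uint8_t)((p[0].g + p[1].g) / 2);
        p[2].b = (uint8_t)((p[0].b + p[1].b) / 2);
        p[3].r = p[3].g = p[3].b = 0;
    }

    /* One byte of 2-bit palette indices per row, texel 0 in the low bits. */
    int idx = (block[4 + y] >> (2 * x)) & 0x3;
    return p[idx];
}

Sixteen texels at 2 bits each plus two endpoint colors is how BC1 gets to 4 bits per texel, an 8:1 ratio against raw RGBA8.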
 
I made this mistake before.

Ah I see; thanks.

But the card has to store the compressed textures separately, then decompress the "needed" ones for the frame buffer on the fly...

Does that really decrease the amount of space taken up? Wouldn't it be on a game-by-game basis?

Load 3GB of textures, compression makes it 2GB... if you need 1GB of decompressed textures or less you save VRAM; need more than 1GB and you end up using more?

I suppose in general you'd be caching far more than you are using in the rendering pipeline though.
 
Based on the GPU-Z screenshots of the 970 and the 7970, they look the same. Is there really that much better performance? If so, why?

They're really not. A 970 is a ~30% performance jump, more when OC'd.

Probably not. I think when the 20nm or 16nm cards come out, you should consider upgrading everything. By that point Intel's Skylake CPUs will be out (14nm), not to mention the DDR4 RAM upgrade.

DDR4 isn't going to make much of a difference.

Skylake won't be out by that stage either.
 
I don't even have the money right now, and though I could probably make it work, I'm just going to wait for the 8GB version.
 
Ok cool. Having only run my 670 at stock, but having zero issues, and not really fully understanding the longevity effects of OC'ing GPUs (if any?), I'd like that trend of zero issues to continue :)

As far as I know, the only possible side effect is increased heat. But if you can overclock and still stay in the temperature safe zone, you'll be fine.
 
Ah I see; thanks.

But the card has to store the compressed textures separately, then decompress the "needed" ones for the frame buffer on the fly...

Does that really decrease the amount of space taken up? Wouldn't it be on a game-by-game basis?

Load 3GB of textures, compression makes it 2GB... if you need 1GB of decompressed textures or less you save VRAM; need more than 1GB and you end up using more?

I suppose in general you'd be caching far more than you are using in the rendering pipeline though.

It decompresses the textures on the fly and writes what it needs to the frame as it goes along, drawing pixel by pixel. Once we're done with a texture (even if we'll need it again in the next frame) we're not still holding that decompressed texture in memory. You're not ever going to have anything even approaching 1GB of decompressed textures in memory at a time.

You can call a specific part of a compressed texture without having to decompress the whole thing. It's pretty clever technology.

Right now, textures are pretty much all compressed. Better compression (smaller with no loss in IQ) would free up more memory.
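
To put rough numbers on that (back-of-the-envelope math on my part, not anything Nvidia has published): here's what one 2048x2048 texture costs at the standard block-compressed rates versus raw RGBA8.

#include <stdio.h>

/* VRAM footprint of one 2048x2048 texture at common rates.
   BC1 is 4 bits per texel, BC7 (the newer DX11 format) is 8;
   mipmaps would add roughly a third on top of each figure. */
int main(void)
{
    const long texels = 2048L * 2048L;
    printf("RGBA8 (32 bpp): %ld MiB\n", (texels * 4) / (1024 * 1024));
    printf("BC7    (8 bpp): %ld MiB\n", (texels * 1) / (1024 * 1024));
    printf("BC1    (4 bpp): %ld MiB\n", (texels / 2) / (1024 * 1024));
    return 0;
}

A game shipping hundreds of textures like that lives or dies by those ratios, which is why a format that compresses better at the same quality directly frees up VRAM.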
 
They're really not. A 970 is a ~30% performance jump, more when OC'd.



DDR4 isn't going to make much of a difference.

Skylake won't be out by that stage either.

Skylake is coming out Q2-Q3 2015 IIRC. Around the same time as the new cards.

DDR4 is a bonus.
 
Ah I see; thanks.

But the card has to store the compressed textures separately, then decompress the "needed" ones for the frame buffer on the fly...

Does that really decrease the amount of space taken up? Wouldn't it be on a game-by-game basis?

Load 3GB of textures, compression makes it 2GB... if you need 1GB of decompressed textures or less you save VRAM; need more than 1GB and you end up using more?

I suppose in general you'd be caching far more than you are using in the rendering pipeline though.

When you pull a texture from VRAM into the rendering pipeline, it gets inserted into the L1 texture cache to be worked on by the TMU pipe. It's at that point that the texture is decompressed, since each TMU's L1 texture cache has the logic to decompress the compressed texture formats. Since this logic is duplicated dozens or even over a hundred times (depending on the number of TMUs), it needs to be simple to minimize the number of transistors the L1 texture cache takes up on the die. This is why new texture formats are introduced sparingly: you need a giant boost in performance to justify the extra transistors, which are then going to be replicated X number of times, where X can be anywhere from four to hundreds.
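
One way to see why it's worth burning those transistors: the same tiny L1 texture cache covers several times more texels when it holds compressed blocks. A quick sketch (the cache size here is a made-up round number; actual per-TMU cache sizes aren't public):

#include <stdio.h>

/* How far a fixed-size L1 texture cache stretches when it holds
   compressed blocks instead of raw texels. 12 KiB is hypothetical. */
int main(void)
{
    const int cache_bytes = 12 * 1024;
    const int raw_bytes_per_block = 16 * 4;  /* 4x4 RGBA8 texels */
    const int bc1_bytes_per_block = 8;       /* one BC1 block    */
    printf("texels resident, RGBA8: %d\n",
           (cache_bytes / raw_bytes_per_block) * 16);
    printf("texels resident, BC1:   %d\n",
           (cache_bytes / bc1_bytes_per_block) * 16);
    return 0;
}

Eight times the coverage from the same cache, at the cost of a small fixed-function decoder sitting next to it.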
 
We don't have any confirmation of 8GB versions of these cards yet? Not even a concrete rumor?

That's one hell of an oxymoron.
To answer your question, I've seen a bunch of rumors and supposed leaks about the 8GB versions, but most of it was before the cards were even announced, I think. Wait and see, I guess.
 
That's one hell of an oxymoron.
To answer your question, I've seen a bunch of rumors and supposed leaks about the 8GB versions, but most of it was before the cards were even announced, I think. Wait and see, I guess.

By concrete rumor, I mean one from a reliable source. There are rumors, and then there are rumors that are really leaks. Perhaps I should have stated it better.
 
The 4GB cards only use 8 chips, and the PCB layouts are already there for 16 in clamshell mode, or even possibly (wild speculation?) a 512-bit 8GB GM210 Ti variant.
 
I remember reading an article a few days ago saying that, for the first time, Intel will be releasing two architectures next year (Broadwell and Skylake).

Seems so! I hadn't seen that. If Skylake is going to be ready in time, they might as well skip Broadwell; no one is going to buy it if they can help it.
 
The 4GB cards only use 8 chips, and the PCB layouts are already there for 16 in clamshell mode, or even possibly (wild speculation?) a 512-bit 8GB GM210 Ti variant.

Not all of them, but yes, when you see a GTX 970 like this one

[image: GTX 970 PCB]

I don't see how an 8GB card is not already in the works.
 
I remember reading an article a few days ago saying that, for the first time, Intel will be releasing two architectures next year (Broadwell and Skylake).

There's a variety of different parts. It's not like one entire architecture is released on the same day and then we're done for a year and a half. Broadwell Y parts are on the ground, ready to go into ultra-slim fanless form factors this holiday season. We still haven't seen U, H, or K parts yet. I suspect we'll see the same with Skylake next year.
 
Just finished playing The Vanishing of Ethan Carter. Was getting 45-60fps at 2560x1440 downsampled to 1080p (thanks, DSR), with 4xMSAA and all settings on max. Looks beautiful. Stock MSI Twin Frozr 970.
 
Just finished playing The Vanishing of Ethan Carter. Was getting 45-60fps at 2560x1440 downsampled to 1080p (thanks, DSR), with 4xMSAA and all settings on max. Looks beautiful. Stock MSI Twin Frozr 970.

If you can play in 3D, play in 3D.

I've got a few tweaks to try out when I get in, which will hopefully cure the stuttering I was getting and get the thing using SLI. Even stuttering (and with a few glitches... the reflections on the water don't quite look right in 3D) the game looked beautiful. Strangely real. That photogrammetry stuff really works.

Didn't it come out today? How short is the game?

Yesterday. 3-5 hours long. $20.

And it's beautiful.
 
Just got an MSI 970 and downloaded Firestrike for the first time. I'm getting a black screen after the first test. I can still move the cursor, but I can't alt-tab to the desktop or bring up Task Manager, forcing me to do a hard reset of the system. I reset the entire BIOS to default settings. I can't swap out the GPU with another as it's my only DX11 card. No idea why this is happening. PlanetSide 2 and World of Warcraft run absolutely fine.
 
Just got an MSI 970 and downloaded Firestrike for the first time. I'm getting a black screen after the first test. I can still move the cursor, but I can't alt-tab to the desktop or bring up Task Manager, forcing me to do a hard reset of the system. I reset the entire BIOS to default settings. I can't swap out the GPU with another as it's my only DX11 card. No idea why this is happening. PlanetSide 2 and World of Warcraft run absolutely fine.

Updated DirectX?
 
If a brother can get an 8GB GTX 970 for $400-$430, it might be worth it, but anything higher than that and I'll just stick to my High settings and call it quits.
 
If you can play in 3D, play in 3D.

I've got a few tweaks to try out when I get in, which will hopefully cure the stuttering I was getting and get the thing using SLI. Even stuttering (and with a few glitches... the reflections on the water don't quite look right in 3D) the game looked beautiful. Strangely real. That photogrammetry stuff really works.



Yesterday. 3-5 hours long. $20.

And it's beautiful.


Does it have proper 3D support, or only via a 3rd-party driver? I could hook it up to my 3DTV via HDMI, if Nvidia supports that?
 
Does it have proper 3D support, or only via a 3rd-party driver? I could hook it up to my 3DTV via HDMI, if Nvidia supports that?

To do that you need a piece of Nvidia software called 3DTV Play. You can get it from NVidia.com for $40. Pretty cheap if you ask me. Then you just enable it in your drivers, and that's pretty much it.

There's a good number of games with support and even more that have had mods made to get them up to scratch.

http://helixmod.blogspot.com/ covers the mods. A lot of good stuff there.

I tried it out with Ethan Carter yesterday expecting it to be terrible, but the devs have actually been putting in effort to have a good 3D experience in the game ahead of launch, so that's great. Still needs a tweak or two to be perfect, but yeah. Really really pretty.

edit: Looks like they've fixed the broken reflections in a mod already. No messing around! Major props to the Helixmod guys.
 