
Rumor: NVIDIA GeForce RTX 4070 Graphics Card Specs, Performance, Price & Availability (300W + 36TF performance)

rofif

Can’t Git Gud
Games are not going to be designed around series s.

They're going to be cut down to work on series s.

Also, if you want native 4K60 or even 120, you gotta go PC. PCs also have much faster CPUs already, so I'm not sure what your point about UE5 is.
Fundamentally, games have to be DESIGNED to work on the lowest common denominator.
Then other, more powerful hardware will just run them faster and at higher resolution.
So that 100TF GPU will run Xbox Series S games at 4K120 with a few added effects.
 
Fundamentally, games have to be DESIGNED to work on the lowest common denominator.
Then other, more powerful hardware will just run them faster and at higher resolution.
So that 100TF GPU will run Xbox Series S games at 4K120 with a few added effects.
They're not being made for Series S first. Series S is already getting sidelined.

This is totally different from the PS2 generation.
 
Last edited:

winjer

Gold Member
Devs won't target the Series S. They will target the PS5 and the Series X.
Then they'll just lower graphics and resolution until it kinda runs well enough on the Series S, and call it a day.
 

winjer

Gold Member
Hopefully.
winjer yeah - PS5/XSX are the target.
Scalability to new graphics cards will just come naturally.

Between the PS5 and Series X, they'll probably sell around 200 million units. And they are so close in power that it's just a matter of adjusting resolution a bit.
The Series S can maybe sell 20 million by itself. No dev in their right mind will target the least-sold console.

On PC it will be as usual: the same game as on consoles, but with higher frame rates and higher resolution for shadows, ray tracing, fog volumes, textures, etc.
Maybe if it's a game sponsored by NVIDIA, AMD, or Intel, it might use some extra effects or tech.
 
They're not being made for Series S first. Series S is already getting sidelined.

This is totally different from the PS2 generation.

Given that the S is outselling the X, that doesn't seem like a smart idea by devs. Sure, they will cut the resolution and the fps down to 30. But you can only cut so far.
 

rofif

Can’t Git Gud
Between the PS5 and Series X, they'll probably sell around 200 million units. And they are so close in power that it's just a matter of adjusting resolution a bit.
The Series S can maybe sell 20 million by itself. No dev in their right mind will target the least-sold console.

On PC it will be as usual: the same game as on consoles, but with higher frame rates and higher resolution for shadows, ray tracing, fog volumes, textures, etc.
Maybe if it's a game sponsored by NVIDIA, AMD, or Intel, it might use some extra effects or tech.
The gap between consoles and PC is closing.
It really is down to resolution and framerate, plus extra ray tracing features.
Even during the last PS4 gen, texture quality and many features were exactly the same on PC.
This gen, the consoles started really strong compared to last time: SSDs, direct I/O, real CPUs, and so on.
I expect PC ports to be exactly the same as the console versions, just with higher res/fps and high/ultra ray tracing quality.
And cutting things down on consoles is getting much easier thanks to image reconstruction. They'll internally render at 720p on Series S if they have to :p
 
Given that the S is outselling the X, that doesn't seem like a smart idea by devs. Sure, they will cut the resolution and the fps down to 30. But you can only cut so far.
Developers don't want their vision constrained to that device. They would rather run their games at 540p and 720p (and have already done so) than gimp the XSX/PS5 versions.

Thing is, people who are buying series S are far less likely to care what their frame rate/graphics are.
 
Last edited:

Polygonal_Sprite

Gold Member
Games on PC are not meant to be just played at 1080p or lower, with low-medium graphics settings, no RT, at 30 fps.
Imagine real 4K, 120-360 fps, RT at high settings, and higher quality settings all around.
Lol obviously not, but 100TF seems like so much overkill even for current-gen-only games at 4K/Ultra/120fps.
 

The Cockatrice

Gold Member
I mean, I would jump from a 2080 to a 4070, but... what exactly would I play on it? There are no big games on the horizon for PC for at least two years that would need such a beast, especially now with DLSS/FSR 2.0.
 

Barakov

Gold Member
The NVIDIA GeForce RTX 4070 will be a next-generation high-end gaming graphics card built on the latest Ada Lovelace GPU architecture. It will replace the RTX 3070, a very popular gaming graphics card in the $500-$600 US segment.

The RTX 4070 series graphics cards will be designed around the $500 US segment, a high-end price range that still offers lots of performance. It's simple: the RTX 4090 series will be aimed at users who want the best of the best without worrying about how much they spend, while the RTX 4080 series is aimed at users who want the best gaming performance at the best possible price. The RTX 4070 will be the sweet spot for high-end gaming, offering a buttery-smooth 2K gaming experience.

The previous GeForce RTX 3070 was touted to offer a huge improvement over the RTX 2070 and was said to be faster than the RTX 2080 Ti, but it ended up mostly on par with the Turing flagship, with only the RTX 3070 Ti exceeding it. It looks like the RTX 4070 will be placed in a similar position: it might offer graphics performance on par with or close to the RTX 3080 Ti, with a 'Ti' variant going further ahead.


NVIDIA's AD104 'Ada Lovelace' GPU - The Next-Gen Powerhouse

Starting with the GPU configuration, the NVIDIA GeForce RTX 4070 series graphics cards are said to utilize the AD104 GPU core. The GPU is said to measure around 300mm2 and will utilize the TSMC 4N process node, an optimized version of TSMC's 5nm (N5) node designed for the green team.

The NVIDIA Ada Lovelace AD104 GPU is expected to feature up to 5 GPCs (Graphics Processing Clusters). That is one fewer GPC than the GA104 GPU. Each GPC will consist of 6 TPCs of 2 SMs each, the same configuration as the existing chip. Each SM (Streaming Multiprocessor) will house four sub-cores, which is also the same as the GA102 GPU. What's changed is the FP32 & INT32 core configuration. Each SM will include 128 FP32 units, but the combined FP32+INT32 count goes up to 192. This is because the FP32 units don't share the same sub-core as the INT32 units: the 128 FP32 cores are separate from the 64 INT32 cores.

So in total, each sub-core will consist of 32 FP32 plus 16 INT32 units, for a total of 48 units. Each SM will have 128 FP32 units plus 64 INT32 units, for a total of 192 units. And since there are a total of 60 SM units (12 per GPC), we are looking at 7,680 FP32 units and 3,840 INT32 units, for a total of 11,520 cores. Each SM will also include two warp schedulers (32 threads/CLK) for 64 warps per SM. This is a 50% increase in cores (FP32+INT32) and a 33% increase in warps/threads versus the GA102 GPU.
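
Quick sanity check of that math (a Python sketch; every input is the rumor's figure, nothing confirmed):

```python
# Rumored AD104 unit counts from the paragraph above (not confirmed specs).
gpcs = 5                 # rumored GPCs on AD104
sms_per_gpc = 12         # 6 TPCs per GPC x 2 SMs per TPC
fp32_per_sm = 128        # 4 sub-cores x 32 FP32 units
int32_per_sm = 64        # 4 sub-cores x 16 INT32 units

sms = gpcs * sms_per_gpc
fp32_units = sms * fp32_per_sm
int32_units = sms * int32_per_sm
print(sms, fp32_units, int32_units, fp32_units + int32_units)
# -> 60 7680 3840 11520, matching the totals quoted above
```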

NVIDIA AD103 'Ada Lovelace' Gaming GPU 'SM' Block Diagram (Image Credits: Kopite7kimi):




  • 5 GPCs vs 6 GPCs on GA104
  • +25% Cores vs GA104 GPU
  • 50% More L1 Cache (Versus Ampere GA104)
  • 2x the L2 Cache (Versus Ampere GA104)
  • +66% ROPs (Versus Ampere GA104)
  • 4th Gen Tensor & 3rd Gen RT Cores

[Image: NVIDIA Ada Lovelace AD104 GPU block diagram]

NVIDIA GeForce RTX 4070 Series Preliminary Specs:

Graphics Card | NVIDIA GeForce RTX 4070 Ti | NVIDIA GeForce RTX 4070 | NVIDIA GeForce RTX 3070 Ti | NVIDIA GeForce RTX 3070
GPU Name | AD104-400? | AD104-300? | Ampere GA104-400 | Ampere GA104-300
Process Node | TSMC 4N | TSMC 4N | Samsung 8nm | Samsung 8nm
Die Size | ~300mm2 | ~300mm2 | 395.2mm2 | 395.2mm2
Transistors | TBD | TBD | 17.4 Billion | 17.4 Billion
CUDA Cores | ~7680 | ~7040 | 6144 | 5888
TMUs / ROPs | TBD / 160 | TBD / 144 | 192 / 96 | 184 / 96
Tensor / RT Cores | TBD / TBD | TBD / TBD | 192 / 48 | 184 / 46
Base Clock | TBD | TBD | 1575 MHz | 1500 MHz
Boost Clock | TBD | TBD | 1770 MHz | 1730 MHz
FP32 Compute | ~38 TFLOPs | ~36 TFLOPs | 22 TFLOPs | 20 TFLOPs
RT TFLOPs | TBD | TBD | 42 TFLOPs | 40 TFLOPs
Tensor TOPs | TBD | TBD | 174 TOPs | 163 TOPs
Memory Capacity | 12 GB GDDR6X? | 12 GB GDDR6 | 8 GB GDDR6X | 8 GB GDDR6
Memory Bus | 192-bit | 192-bit | 256-bit | 256-bit
Memory Speed | 21 Gbps | 18 Gbps | 19 Gbps | 14 Gbps
Bandwidth | 504 GB/s | 432 GB/s | 608 GB/s | 448 GB/s
TGP | ~330W | ~300W | 290W | 220W
Price (MSRP / FE) | $599 US? | $499 US? | $599 US | $499 US
Launch (Availability) | 2022 | 2022 | 10th June 2021 | 29th October 2020


Just for comparison's sake:

  • NVIDIA GeForce RTX 4090 Ti: ~103 TFLOPs (FP32) (Assuming 2.8 GHz clock)
  • NVIDIA GeForce RTX 4090: ~90 TFLOPs (FP32) (Assuming 2.8 GHz clock)
  • NVIDIA GeForce RTX 4080: ~50 TFLOPs (FP32) (Assuming 2.5 GHz clock)
  • NVIDIA GeForce RTX 3090 Ti: 40 TFLOPs (FP32) (1.86 GHz Boost clock)
  • NVIDIA GeForce RTX 4070 Ti: ~38 TFLOPs (FP32) (Assuming 2.5 GHz clock)
  • NVIDIA GeForce RTX 4070: ~36 TFLOPs (FP32) (Assuming 2.5 GHz clock)
  • NVIDIA GeForce RTX 3090: 36 TFLOPs (FP32) (1.69 GHz Boost clock)
  • NVIDIA GeForce RTX 3080: 30 TFLOPs (FP32) (1.71 GHz Boost clock)
  • NVIDIA GeForce RTX 3070 Ti: 22 TFLOPs (FP32) (1.77 GHz Boost clock)
  • NVIDIA GeForce RTX 3070: 20 TFLOPs (FP32) (1.72 GHz Boost clock)
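
All of these FP32 figures come from the same formula: CUDA cores x 2 FLOPs per clock (one FMA) x clock speed. A minimal sketch; the 40-series clocks here are the rumor's assumptions, not announced specs:

```python
def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    # Each CUDA core can retire one fused multiply-add (2 FLOPs) per clock.
    return cuda_cores * 2 * clock_ghz / 1000

print(round(fp32_tflops(7680, 2.5), 1))    # RTX 4070 Ti (rumored): 38.4
print(round(fp32_tflops(7040, 2.5), 1))    # RTX 4070 (rumored):    35.2
print(round(fp32_tflops(5888, 1.725), 1))  # RTX 3070 at boost:     20.3
```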

Full article: https://wccftech.com/roundup/nvidia-geforce-rtx-4070/
 

StreetsofBeige

Gold Member
I don't follow GPU pricing, but everyone knows about all the scalping and low stock of 3000-series GPUs.

I wonder if Nvidia will price the cards at a reasonable RSP like always and let scalpers and consumers battle it out for stock and eBay pricing. Or if Nvidia will get on the train and say fuck it, hike up costs to stores a lot, and set the RSP at double what it should normally be.

You never know.

When it comes to new home and condo developments, builders have jacked up prices big time over the past 10 years because they know everyone is making too much money flipping them. So they got in on the game and literally tripled what a new home build costs versus way back.

If Jensen Huang (whatever the hell his name is) wants to perk up Nvidia's profits, he can just zoom up the 4000-series prices.

(Edit: I didn't bother reading the OP, and it looks like the price is set. Given the drama of the last bunch of years, Nvidia is leaving a lot of profit on the table. They could probably sell them for $1,000 and it'd be a sellout.)
 
Last edited:
I don't follow GPU pricing, but everyone knows about all the scalping and low stock of 3000-series GPUs.

I wonder if Nvidia will price the cards at a reasonable RSP like always and let scalpers and consumers battle it out for stock and eBay pricing. Or if Nvidia will get on the train and say fuck it, hike up costs to stores a lot, and set the RSP at double what it should normally be.
If you dislike scalpers, you'd better hope Nvidia increases the MSRP. In hindsight, the RTX 3xxx pricing was too low, which led to scalping.
 

Bojji

Gold Member
They're not being made for Series S first. Series S is already getting sidelined.

This is totally different from the PS2 generation.

Games are made for the Xbox One. But after cross-gen ends, they will be made for the Series S.

Devs won't target the Series S. They will target the PS5 and the Series X.
Then they'll just lower graphics and resolution until it kinda runs well enough on the Series S, and call it a day.

That's not how game development works. Games will be made with the Series S RAM pool in mind; they can't scale anything related to gameplay within that, only internal resolution and texture resolution. The Series S version could be shit in the end, but that shouldn't fool people into thinking it wasn't the target platform.

- 10 GB of RAM is the target for devs once cross-gen ends. With 7.5 GB (or 8?) usable, that's just ~50% (or slightly more) more than what they had since 2013.
- The CPU is the same, so no problem there.
- The GPU is the only part that really scales, so no problem there (even at 720p LOL)
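
Back-of-envelope check of that ~50% claim (the usable-RAM numbers are rough public estimates, not official figures):

```python
xss_usable_gb = 7.5      # Series S RAM available to games (approx. estimate)
lastgen_usable_gb = 5.0  # PS4 / Xbox One RAM available to games (approx. estimate)

gain = xss_usable_gb / lastgen_usable_gb - 1
print(f"{gain:.0%} more usable RAM than last gen")  # -> 50% more
```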
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
If you dislike scalpers, you'd better hope Nvidia increases the MSRP. In hindsight, the RTX 3xxx pricing was too low, which led to scalping.
Scalping was due to shortages and a bunch of people buying for the crypto boom.
Without both of those, scalping will be much less prevalent.
There's no reason to buy on eBay if stores have stock.
$500 for an xx70 is already high in my opinion, considering reference 1070s could be had for ~$350... even the big Strix cards were still around $450.
Today, with overclocking being pretty much useless, a reference card with a decent cooler is your best bet; you gain very little going for an Aorus Master, Suprim X, or Strix card.
 
Scalping was due to shortages and a bunch of people buying for the crypto boom.
Without both of those, scalping will be much less prevalent.
There's no reason to buy on eBay if stores have stock.
Yes, but my post logically follows from this. You are correctly stating that the industry is struggling with a shortage and very high demand (due to crypto, but probably also increased demand from more indoor activity during COVID). Given a constrained supply, you have to increase the price until supply equals demand, or else actors in the market will create the equilibrium themselves by buying underpriced stock and selling it at its actual market value. Currently, prices are deflating, but they're still $100-300 above MSRP if my cursory search was reliable. Nvidia's pricing will depend a lot on the trajectory of market prices. If they keep falling, Nvidia might not charge more than for the 3xxx series. The 3xxx was definitely underpriced for the 2020-2022 period; that is undeniable.
 
Last edited:

winjer

Gold Member
Lol obviously not, but 100TF seems like so much overkill even for current-gen-only games at 4K/Ultra/120fps.

A few things to consider.
100 TFLOPs is a rumor.
Even if it's true, it's only for the biggest die. And even then, only for chips that use the full die.
Remember that Ampere ditched the dedicated INT units in its SMs. If Ada Lovelace is similar, then those 100 TFLOPs will be equivalent to 60-65 TFLOPs from Turing or RDNA.
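
One way to picture that discount, as a toy model: on Ampere (and presumably Ada), half of the FP32 units sit on a datapath shared with INT32 work, so every integer instruction displaces a potential float instruction. Using NVIDIA's own Turing-era estimate of roughly 36 INT instructions per 100 FP instructions in games (an assumption on my part, not something from the rumor):

```python
# Toy model: the shared FP32/INT32 datapath spends part of its slots on
# integer work, so peak FP32 TFLOPs are never reached in real shaders.
def effective_fp32(peak_tflops: float, int_per_100_fp: float = 36.0) -> float:
    return peak_tflops * 100 / (100 + int_per_100_fp)

print(round(effective_fp32(100)))  # -> ~74 "Turing-equivalent" TFLOPs
```

That crude model lands a bit above the 60-65 range, but it makes the same point: the paper TFLOPs number overstates real-world shader throughput.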

That's not how game development works. Games will be made with the Series S RAM pool in mind; they can't scale anything related to gameplay within that, only internal resolution and texture resolution. The Series S version could be shit in the end, but that shouldn't fool people into thinking it wasn't the target platform.

- 10 GB of RAM is the target for devs once cross-gen ends. With 7.5 GB (or 8?) usable, that's just ~50% (or slightly more) more than what they had since 2013.
- The CPU is the same, so no problem there.
- The GPU is the only part that really scales, so no problem there (even at 720p LOL)

LOL, mate, all they have to do is lower texture resolution, tweak the mipmap LOD bias, reduce draw distance, and be done with it. That will save all the memory they need.
No point in making a game for the weaker console, with the smaller install base.

Here is an example from Unreal Engine of how mip levels affect memory usage.


[Image: Unreal Engine table of texture memory usage per mip level]
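
And a minimal sketch of the arithmetic behind that table (plain uncompressed RGBA8 here rather than UE's block-compressed formats, but the divide-by-4-per-mip scaling is the same):

```python
def mip_chain_mib(width: int, height: int, skip_top_mips: int = 0,
                  bytes_per_texel: int = 4) -> float:
    """Memory for a mip chain, optionally dropping the largest mip levels."""
    total, w, h, mip = 0, width, height, 0
    while True:
        if mip >= skip_top_mips:
            total += w * h * bytes_per_texel
        if w == 1 and h == 1:
            break
        w, h, mip = max(w // 2, 1), max(h // 2, 1), mip + 1
    return total / (1024 * 1024)

print(round(mip_chain_mib(4096, 4096), 1))                   # ~85.3 MiB, full chain
print(round(mip_chain_mib(4096, 4096, skip_top_mips=1), 1))  # ~21.3 MiB, top mip dropped
```

Dropping a single top mip level cuts a texture's footprint by 75%, which is exactly the lever being described.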
 
Last edited:

BigBooper

Member
Yes, but my post logically follows from this. You are correctly stating that the industry is struggling with a shortage and very high demand (due to crypto, but probably also increased demand from more indoor activity during COVID). Given a constrained supply, you have to increase the price until supply equals demand, or else actors in the market will create the equilibrium themselves by buying underpriced stock and selling it at its actual market value. Currently, prices are deflating, but they're still $100-300 above MSRP if my cursory search was reliable. Nvidia's pricing will depend a lot on the trajectory of market prices. If they keep falling, Nvidia might not charge more than for the 3xxx series. The 3xxx was definitely underpriced for the 2020-2022 period; that is undeniable.
The people buying it at those prices mostly thought it would still be worth the high price because of mining. If mining wasn't a thing, the average scalper price wouldn't have been nearly as high.
 
The people buying it at those prices mostly thought it would still be worth the high price because of mining. If mining wasn't a thing, the average scalper price wouldn't have been nearly as high.
Sure, I didn't contest that. There are reasons for the high demand. The fact of the matter is, if supply is constrained and demand is too high, you raise the price or someone else will do it for you.
 

Bojji

Gold Member
LOL, mate, all they have to do is lower texture resolution, tweak the mipmap LOD bias, reduce draw distance, and be done with it. That will save all the memory they need.
No point in making a game for the weaker console, with the smaller install base.

Have you seen any gen where games were made for the more powerful console first? Only some Xbox games were made for that console first and then gutted in their PS2 versions (like Splinter Cell); more than 90% of multiplatform games had the PS2 as the target. But that was the last console gen where consoles had massive memory differences. Since then, the main consoles have been on par in memory amount; the Series S fucks things up...

And the XSS will likely have a higher install base than the XSX.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
The people buying it at those prices mostly thought it would still be worth the high price because of mining. If mining wasn't a thing, the average scalper price wouldn't have been nearly as high.
Exactly, people were expecting prices to stay high because the crypto bubble wasn't going to burst.
But right now no one needs/wants GPUs for mining; even with LHRv2 having been cracked, it would take forever to get ROI.
So unless Ada has like double the mining performance of Ampere, I'm pretty sure demand will be much, much lower for Ada than for Ampere.
Also, considering a bunch of people now have 30xx GPUs in their work-from-home computers, methinks Ada won't have Ampere levels of demand.


P.S. LHRv4 should be included in the Ada drivers... so cracking LHR will start from scratch.
 

winjer

Gold Member
Have you seen any gen where games were made for the more powerful console first? Only some Xbox games were made for that console first and then gutted in their PS2 versions (like Splinter Cell); more than 90% of multiplatform games had the PS2 as the target. But that was the last console gen where consoles had massive memory differences. Since then, the main consoles have been on par in memory amount; the Series S fucks things up...

And the XSS will likely have a higher install base than the XSX.

Most games are still cross-gen, and made for the previous gen. But they are not made for the Series S.
Soon we'll see most devs developing only for the current gen, and then the target becomes the PS5 and Series X.
For the Series S, devs will just lower the resolution of the screen, shadows, textures, and other effects until it runs well enough.
 
Pascal to Turing was exactly that. The 2080 Ti could probably have been a ton faster if AMD had been pushing, and they sold it for insane prices, almost double that of the 1080 Ti.

Nvidia cares about being the top dog, and being the one that profits the most. The 4080 having less than half the TFLOPs of the 4090 Ti is basically screaming to me that AMD simply doesn't have much going for them with RDNA3; even that guy in my post stated RDNA3 was disappointing performance-wise.

This could very well mean they made the 4080 to sit around 3090/3090 Ti performance, and the 4070 around 3080 performance, because there isn't much reason for them to push further. Their halo product, the 4090 Ti, will basically cull AMD out of the top end again in benchmarks. Big chance AMD will also have lots of yield issues or other problems with their new architecture.

So yeah, I can see why it would happen. The only reason the 3080 is so close to the 3090 performance-wise is that AMD was pushing the 6800 XT, so Nvidia had to counter it. If that card didn't exist, you would have seen the 3070 being sold as the 3080; and if the 6900 XT didn't exist, they would probably have called the 3080 their Ti product, and the 3090 wouldn't even exist.

But we will see tho.
This is true about selling the 3070 as a 3080. Nvidia did it in the past. If you remember, the famous and great 680 graphics card was planned as a mid-range card, but AMD had nothing to stand against Nvidia's cards, so Nvidia introduced the 680 as high-end. But this time AMD is on fire: they make amazing CPUs, the best, and will be head-to-head with Nvidia. Nvidia will only win if they have the biggest card. At the same die size, I think AMD will win in any metric except maybe ray tracing.
 

Bojji

Gold Member
Most games are still cross-gen, and made for the previous gen. But they are not made for the Series S.
Soon we'll see most devs developing only for the current gen, and then the target becomes the PS5 and Series X.
For the Series S, devs will just lower the resolution of the screen, shadows, textures, and other effects until it runs well enough.

The entire game design has to be made with this ~8GB RAM pool in mind, not much more than on the X1/PS4.

They can cut textures and stuff, but whatever they do can't exceed this memory limit; every 9th-gen game will be made with this ceiling in mind, aside from PS5 exclusives. That's why devs aren't happy with the XSS's existence and why many people say this console is like an anchor for the XSX and PS5.

I'm stopping this offtopic now :messenger_winking:

What kind of improvement do we get from this compared to the PS5 and Xbox Series X?

Higher resolution, better performance, and better ray tracing. The usual stuff.
 
Last edited:

winjer

Gold Member
The entire game design has to be made with this ~8GB RAM pool in mind, not much more than on the X1/PS4.

They can cut textures and stuff, but whatever they do can't exceed this memory limit; every 9th-gen game will be made with this ceiling in mind, aside from PS5 exclusives. That's why devs aren't happy with the XSS's existence and why many people say this console is like an anchor for the XSX and PS5.

You really don't understand what can be done with mip levels, lower resolution for effects and assets, etc.
This is just a matter of changing some cvars in a game engine.
The issue devs have is that the Series S forces additional work for a small install base.
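
For illustration, the kind of per-platform trim that amounts to, written as Unreal-style console-variable overrides (the cvar names are real UE cvars, but the values are made up for the example, not a shipped Series S profile):

```python
# Hypothetical cut-down platform profile; values are illustrative only.
series_s_style_overrides = {
    "r.ScreenPercentage": 66,        # render at ~66% of output resolution
    "r.Streaming.MipBias": 1,        # drop one mip level on streamed textures
    "r.Streaming.PoolSize": 2000,    # shrink the texture streaming pool (MB)
    "r.Shadow.MaxResolution": 1024,  # smaller shadow maps
    "r.ViewDistanceScale": 0.7,      # pull in draw distance
}
```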
 

GymWolf

Gold Member
Not sure if a jump from a 2070 Super to a 4070 is big enough; I'm probably gonna go with a 4080 if the price is around 700 euros.
 

Bojji

Gold Member
You really don't understand what can be done with mip levels, lower resolution for effects and assets, etc.
This is just a matter of changing some cvars in a game engine.
The issue devs have is that the Series S forces additional work for a small install base.

The XSS will have quite a high install base (maybe higher than the XSX).

Memory:

Billy Khan, lead engine programmer at id Software, says the Series S’ RAM is “a major issue”, and says that the “much lower amount of memory and the split memory banks with drastically slower speeds” will prove to be problematic. Similarly, Alex Gneiting, principal engine programmer at id Software, agrees with that sentiment, and says that the RAM deficiency won’t be easy to compensate, and will drag down the base specs that developers will have to consider noticeably for multiplatform games.

 

winjer

Gold Member
The XSS will have quite a high install base (maybe higher than the XSX).

Memory:




The PS5 + Series X are the main target, and will have a much larger install base than the Series S.

The issue with the memory of the Series S has been overstated.
If you understood how changing a few cvars can reduce a game's memory footprint, you wouldn't be making such a fuss about the Series S memory.
Once again, take a look at the mipmap-level table from Unreal Engine that I posted, to get an idea of how much memory can be saved just by tweaking mip LODs.
 

GymWolf

Gold Member
The jump from a 5700 XT/6600 XT/2070S to a 3080 is more than double at 4K, so depending on how a 4070 ends up relative to a 3080, it should be a big jump either way.
Well, since a 3080 is not good for 4K60 in heavy/broken current games, there is no point in buying a GPU that performs roughly the same.

I have a 4K TV; it's really time for me to feed it actual 4K content without decreasing framerate and detail that much.
 
Last edited:

Allandor

Member
300W for an xx70-series card... are they mad?
There is a limit to what can be cooled by air without too much noise, and so far that limit is around 200W. A 300W GPU is a no-go for me. I run my 3070 at a 65% power target because of the cooler's noise. So it seems they are now maxing out the chips where they can, while also reducing memory bandwidth. Wrong direction, Nvidia... wrong direction...
 
Last edited:
300W for an xx70-series card... are they mad?
There is a limit to what can be cooled by air without too much noise, and so far that limit is around 200W. A 300W GPU is a no-go for me. I run my 3070 at a 65% power target because of the cooler's noise. So it seems they are now maxing out the chips where they can, while also reducing memory bandwidth. Wrong direction, Nvidia... wrong direction...

I have noticed a similar thing. I had a 250W GTX 1080 Ti (Asus Strix) in the past, and that GPU was very loud during gaming even at low GPU usage. Now I have a 200W GTX 1080 (Palit GameRock Premium) and I hear absolutely nothing even at 99% GPU usage. It seems like that additional 50W above 200W makes a huge difference.
 

Dream-Knife

Banned
Have you seen any gen where games were made for the more powerful console first? Only some Xbox games were made for that console first and then gutted in their PS2 versions (like Splinter Cell); more than 90% of multiplatform games had the PS2 as the target. But that was the last console gen where consoles had massive memory differences. Since then, the main consoles have been on par in memory amount; the Series S fucks things up...

And the XSS will likely have a higher install base than the XSX.
That era still had a ton of third-party exclusives too. The Xbox got most PC ports unmolested, since it was basically a Pentium III and ~MX440.

A big step up from the prior gen, where the PS1 got entirely different games from the N64.
This is true about selling the 3070 as a 3080. Nvidia did it in the past. If you remember, the famous and great 680 graphics card was planned as a mid-range card, but AMD had nothing to stand against Nvidia's cards, so Nvidia introduced the 680 as high-end. But this time AMD is on fire: they make amazing CPUs, the best, and will be head-to-head with Nvidia. Nvidia will only win if they have the biggest card. At the same die size, I think AMD will win in any metric except maybe ray tracing.
Big Navi was supposed to change the world too. RDNA2 cards were good; I would have been happy with mine had it not been for constant driver issues.
Well, since a 3080 is not good for 4K60 in heavy/broken current games, there is no point in buying a GPU that performs roughly the same.

I have a 4K TV; it's really time for me to feed it actual 4K content without decreasing framerate and detail that much.
That will only work until even more broken games come out. Really devs just need to get their shit together.
300W for an xx70-series card... are they mad?
There is a limit to what can be cooled by air without too much noise, and so far that limit is around 200W. A 300W GPU is a no-go for me. I run my 3070 at a 65% power target because of the cooler's noise. So it seems they are now maxing out the chips where they can, while also reducing memory bandwidth. Wrong direction, Nvidia... wrong direction...
As you noted, that's maximum performance for those that want it. You're always free to degrade your performance.
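
For anyone who wants to do what Allandor describes, a sketch of capping board power from a script; `-pl` is nvidia-smi's documented power-limit switch, it needs admin/root rights, and 200 W is just an illustrative cap:

```python
import subprocess

# Equivalent to running `nvidia-smi -pl 200` by hand: caps the GPU's board
# power limit at 200 W (the value must be within the card's supported range).
subprocess.run(["nvidia-smi", "-pl", "200"], check=True)
```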
 

Athreous

Member
For 2K gaming on ultra at 60+ fps, I wonder if a 4070 would do the job or if I should spring for a 4080 once they're released...
 

Kenpachii

Member
For 2K gaming on ultra at 60+ fps, I wonder if a 4070 would do the job or if I should spring for a 4080 once they're released...

For ultra settings you need the best you can get; ultra always kills fps ridiculously hard. Hell, I could see the next AC requiring a 4090 for 1080p ultra at 60 fps, rofl.
 
Last edited: