MightySquirrel
Offtopic: I'm planning to get the C2 48" since GAF told me it's the best TV for gaming. Will 1440p be enough at 1.5 meters?
> Offtopic: I'm planning to get the C2 48" since GAF told me it's the best TV for gaming. Will 1440p be enough at 1.5 meters?

Who the heck was so drunk to tell you that? Especially for gaming, the C2 just doesn't compare.
> Offtopic: I'm planning to get the C2 48" since GAF told me it's the best TV for gaming. Will 1440p be enough at 1.5 meters?

I don't see why not. If you never game at 4K, then you can't really miss it.
> Offtopic: I'm planning to get the C2 48" since GAF told me it's the best TV for gaming. Will 1440p be enough at 1.5 meters?

I play everything at 1440 on a 65" TV. It's fine.
I have a 3080 and I'm happy with it. Seeing no next-gen-only games coming this year, buying any card higher than a 3080 is really pointless. Once next-gen-only games are out that can use that GPU power, I will probably upgrade. I need a card that will run Unreal Engine 5 games at ultrawide 2K with 120 frames and everything on ultra, and from the looks of it, not even the new-gen cards will do such a thing (if the Matrix demo is anything to go by).
The main game I actually play the most is Call of Duty, and MW2 is a cross-gen game, meaning even the 3070 is more than fine for 2K gaming.
Software isn't really keeping up with the hardware advancement, sadly. We are two years behind when it comes to the software-to-hardware ratio.
> Pretty much what Devil says. Got a 3080; can't get excited about these cards when there's nothing mind-blowing to run on them. Can't see me upgrading until the year after at the earliest; simply no need.
> Although I will be upgrading my laptop once these 240Hz OLEDs drop, but that's a different thread.

If I had a 3080, I would absolutely be holding off until the 5000 series, given there's no upcoming software to take advantage of these cards.
> Offtopic: I'm planning to get the C2 48" since GAF told me it's the best TV for gaming. Will 1440p be enough at 1.5 meters?

It's fine. I play on an 82", and with DLSS 1440p is more than fine. Without it, it's a bit soft but still playable; good TVs are way better at upscaling than monitors.
> My 3080 Ti is enough till at least the pro consoles come out... I'll upgrade after that only if the pro consoles can match or go beyond it.

I'm blindly positive the pro consoles won't touch a base 3080, let alone the Ti.
> Then buy that… the 4060 will likely fall into that range.

It doesn't work like that.
> Pretty much what Devil says. Got a 3080; can't get excited about these cards when there's nothing mind-blowing to run on them. Can't see me upgrading until the year after at the earliest; simply no need.
> Although I will be upgrading my laptop once these 240Hz OLEDs drop, but that's a different thread.

I'm also excited for when we can make that thread!
> It doesn't work like that.
> You can clearly see that GPU market power draw is shifting to around 250-350 W.
> Previously, high-end video cards were less power hungry.
> The 1070 Ti was 150 W and the 1080 was 180 W... and that was already a lot.
> You cannot reasonably ask someone to give up their hobby by capping them to a low-end graphics card. You should be asking the market leaders to innovate and really shrink the dies rather than just expanding them.
> It's not a challenge to make a 3000 W GPU; it's just a matter of die size and materials.

Not disagreeing with how insane the power requirements are getting, but the 4060 should still be really good, maybe as good as a 3080.
> This doesn't make any sense. A 3070 Ti is 5% slower than a 3080, so you think a new-gen 4070 will only be on par with a 3080? Come on!
> The 4070 will be on par with a 3090 Ti, probably about 5% faster.
> The 4080 will be 20-30% faster than a 3090 Ti.
> The 3090 Ti is a 40 TFLOPS GPU; the 4080 is 50. Just for that, it will probably be around 20% faster, and besides that there are the new architecture features.

3070 Ti to 3080 is 20%.
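For what it's worth, the raw TFLOPS figures quoted above (40 vs 50) actually imply a 25% gap rather than 20%; a one-liner makes the arithmetic explicit (though raw TFLOPS rarely translate linearly into frame rates):

```python
def uplift_pct(new_tflops, old_tflops):
    """Relative performance increase implied by raw TFLOPS alone."""
    return (new_tflops / old_tflops - 1) * 100

print(uplift_pct(50, 40))  # 25.0
```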
> 3070 Ti to 3080 is 20%.

No, it is not.
> I'm interested in the laptop GPUs below 100 W and, hopefully, at the same price as current-gen mobile mid-range.

Yeah, waiting on the mobile side myself. Coming from a 2080 Max-Q and a 60Hz OLED, a 4080 mobile and a 240Hz OLED should be a decent bump next year.
> I know I should wait for the next round of cards, but it's hard to wait when I've already waited 5 years.

You can do eeet.
The very same TechPowerUp page does show 20%: https://www.techpowerup.com/gpu-specs/geforce-rtx-3070-ti.c3675
I really doubt there will be that huge a difference in TFLOPS between the RTX 4080 and 4090 (50 TF vs 90 TF).
I also think that even though the RTX 4080 will be a beast, it's still a better choice to have a 1440p/144Hz monitor than 4K/60Hz.
> It doesn't work like that.
> You can clearly see that GPU market power draw is shifting to around 250-350 W.
> Previously, high-end video cards were less power hungry.
> The 1070 Ti was 150 W and the 1080 was 180 W... and that was already a lot.
> You cannot reasonably ask someone to give up their hobby by capping them to a low-end graphics card. You should be asking the market leaders to innovate and really shrink the dies rather than just expanding them.
> It's not a challenge to make a 3000 W GPU; it's just a matter of die size and materials.

No thanks, I 100% disagree with you.
> I know I should wait for the next round of cards, but it's hard to wait when I've already waited 5 years.

If you're coming from a 1070/1080, there's no reason to wait; or get a used 3080 for less when these drop (that's what I would do).
OK guys, keep dreaming that a 4070 will match 3080 Ti performance.
> OK guys, keep dreaming that a 4070 will match 3080 Ti performance.

What does that have to do with the 3080 being 20% above the 3070 Ti?
I know people paid a lot for current gen, but the performance will be there with these new cards, and availability will be better.
> Offtopic: I'm planning to get the C2 48" since GAF told me it's the best TV for gaming. Will 1440p be enough at 1.5 meters?

1440p on a 4K screen is fine, and the only way you'd notice a difference is if a 4K display were sitting right next to you running native 4K content.
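As a rough sanity check on the viewing-distance question, pixels per degree (PPD) can be estimated for a 48" 16:9 panel viewed from 1.5 m; a commonly cited rule of thumb is that around 60 PPD individual pixels stop being resolvable for typical vision. This is a minimal sketch of that geometry, not a verdict on image quality:

```python
import math

def pixels_per_degree(diag_in, horiz_px, distance_m):
    """Approximate PPD for a 16:9 panel viewed head-on."""
    # Panel width from the diagonal: width = diag * 16 / sqrt(16^2 + 9^2)
    width_m = diag_in * 0.0254 * 16 / math.sqrt(16**2 + 9**2)
    # Horizontal field of view subtended by the screen, in degrees
    fov_deg = 2 * math.degrees(math.atan(width_m / 2 / distance_m))
    return horiz_px / fov_deg

print(round(pixels_per_degree(48, 2560, 1.5)))  # 1440p -> ~66 PPD
print(round(pixels_per_degree(48, 3840, 1.5)))  # 4K    -> ~98 PPD
```

So at 1.5 m, 1440p on a 48" panel already sits above the ~60 PPD threshold, which matches the "it's fine" replies in the thread, while native 4K still has visibly more headroom up close.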
> The 80 Ti and 90s are gonna be AD102s, so it's back to the old days of 102s being massively more performant than the xx80s.
> The 1080 Ti was a huge upgrade over the 1080.
> Ampere is where things got weird, with the xx80 being a 102 chip.
> So yeah, I totally expect the 80 Ti, 90, and 90 Ti to be a huge upgrade over the xx80.
> For people with 3080s, we basically have to wait for the 80 Ti or buy a 90.

Well, if the difference between the 4080 and 4090 is about 25-30% in games, with prices around $700 for the 4080 and $1000 for the 4090, I can see many people opting for the 4090, including myself. But I think prices will be around $900/$1400.
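Taking the speculated figures above at face value (the $700/$1000 prices and the 25-30% uplift are guesses from the thread, not real specs), the value trade-off can be sketched as a quick performance-per-dollar comparison:

```python
def perf_per_dollar(relative_perf, price_usd):
    """Performance normalized by price; units are arbitrary, only ratios matter."""
    return relative_perf / price_usd

# Hypothetical numbers from the thread: 4080 at $700 as the 1.0x baseline,
# 4090 at $1000 and ~27.5% faster (midpoint of the speculated 25-30%).
base = perf_per_dollar(1.0, 700)
top = perf_per_dollar(1.275, 1000)
print(round(top / base, 2))  # ~0.89: the pricier card gives ~11% less perf per dollar
```

Under those assumed numbers, the cheaper card wins on value while the pricier one wins on absolute performance, which is exactly the trade-off the post describes.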
> No thanks, I 100% disagree with you.

What are you disagreeing with? The facts?
We have game consoles and lower-powered GPUs for people who want a balanced power/thermal/performance ratio, but I love that the market has said, "You choose what you want. You want to water-cool, push the limits, and get a 390-watt card with a 115% power-limit override? Here you go." I want them to push the limits and price it high as an enthusiast option. People who care about power draw shouldn't be buying a high-end enthusiast GPU, just like people who care about MPG shouldn't be buying a Lamborghini.
> What are you disagreeing with? The facts?
> And putting consoles in the comparison only serves to show how PAINFULLY POWER HUNGRY the GPUs are.
> The PS5 takes 200 watts at most.
> An RTX 2080 alone is around 250 W... and where are the CPU, motherboard, SSD, and fans?

I'm disagreeing with the idea that the current trend is bad. You can still 100% build a low-powered machine that's very powerful; I've built several SFF builds that ran great with a very small power profile. But now we also have the option to go crazy and build 800+ watt beasts of machines. I love it.
> Comparing current-gen consoles to a 2080 is not the best comparison.
> For one, the 2080 is made on the 12nm process node, which was just an improvement on the 16nm node.
> The RTX 2080 also has more hardware in it, like better RT units and full tensor units, resulting in a bigger chip.
> If we are going to compare a PS5 to a PC GPU, it would be closer to a 6600 XT, and in that case the difference is not that big.
> For power consumption, the 6600 XT does have an advantage in having just a 128-bit bus, while the PS5 has a 256-bit bus that is more power hungry.
> This is the real power usage of a 6600 XT, removing other components.

You are still comparing a GPU alone to a whole system.
And yes, the PS5 does compare to a 2070 Ti - 2080.
Wuuut? You expect a new gen of cards to be the same as the old one? A 3090 is 10% faster than a 3080 at 1080p and 15% faster at 4K. What you posted has never happened since graphics cards were invented on this planet. Not once, in the entire history of dedicated graphics cards.
What's the point of 100+ teraflop GPUs when AAA games will be designed around a 4 TFLOP Series S and no one's going to build a AAA PC-exclusive game around them?
Isn't the Unreal Engine 5 Matrix demo also CPU-bound rather than GPU-bound?
> What's the point of 100+ teraflop GPUs when AAA games will be designed around a 4 TFLOP Series S and no one's going to build a AAA PC-exclusive game around them?
> Isn't the Unreal Engine 5 Matrix demo also CPU-bound rather than GPU-bound?

Games are not going to be designed around the Series S.
> You can do eeet.

The issue isn't the card's performance... it's that it has a broken fan header. It's a 2080, but fan 2 goes from idle to 400%; there is no in-between.
> I know I should wait for the next round of cards, but it's hard to wait when I've already waited 5 years.

What card do you currently have? Still running a 980 Ti myself.
> If you wanna know what pain feels like: I bought a 3070 Ti for around 1100 $/€ back in 2021....

The 70 Ti was a scam from the get-go at MSRP...... for 1100 dollars, oooooooofffffffff.