The problem is that none of the expensive cards actually provide the performance people want, so they're a waste of money. There are certain thresholds for cards.
* This card is fast enough to do X resolution at Y framerate
The problem with the $1,200 cards is that they cost $500 more but don't cross the next threshold. A 1080 Ti wasn't fast enough for 4K @ 144 Hz. A 2080 Ti isn't fast enough for 4K @ 144 Hz either. Get over the hill first, and then maybe you can think about bending consumers over a tree stump, but don't put out a paperweight that costs a lot more yet doesn't even clear the next hurdle. What's really accentuating the problem is that games aren't actually programmed for PCs anymore; they're lazy console ports, so none of them actually require these new expensive cards.
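To put rough numbers on those thresholds, here's a back-of-envelope sketch (my own illustration, not from the post above) of the raw pixel throughput each target demands. It ignores scene complexity, so it's only a proxy, but it shows why 4K @ 144 Hz is a much taller hurdle than 4K @ 60 Hz:

```python
# Illustrative only: raw pixels per second a card must shade to
# sustain a given resolution/refresh target. Real performance also
# depends on scene complexity, settings, and the engine.

def pixel_rate(width, height, fps):
    """Pixels per second needed to sustain width x height at fps."""
    return width * height * fps

targets = {
    "1440p @ 144 Hz": pixel_rate(2560, 1440, 144),
    "4K @ 60 Hz":     pixel_rate(3840, 2160, 60),
    "4K @ 144 Hz":    pixel_rate(3840, 2160, 144),
}

# 4K @ 144 Hz needs ~1.19 Gpix/s, 2.4x the 4K @ 60 Hz target
for name, rate in targets.items():
    print(f"{name}: {rate / 1e9:.2f} Gpix/s")
```

So a card that clears 4K @ 60 Hz still needs well over twice the throughput to clear the next threshold, which is the gap the post argues these cards fail to cross.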
When Quake came out in 1996, you had to upgrade just to play it acceptably. You needed a Pentium. If you wanted to play Quake 2 at 640x480 or higher, you had to buy a 3dfx card. It wasn't a choice. Up until Microsoft paid Western developers to effectively abandon native PC development for the original Xbox, games were still actually made for bleeding-edge PC hardware. They aren't anymore. There's no blockbuster game that requires a 2080 Ti even to run playably.
I can agree with a lot of this, but I analyze everything on a case-by-case basis, and I think Nvidia's tech is simply not worth it. At least AMD offers me 16 GB of HBM with crazy bandwidth better suited for 4K; at least they're putting their money where their mouth is by presenting the Radeon VII as a 4K card. Nvidia, in comparison, wants to give me an 8 GB card with cheaper GDDR6 instead of HBM and 448 GB/s of bandwidth for $700, then ask $100 more for the FE version at $800 for the privilege of overclocking that bad boy. Yet the FE, when overclocked, only matches the boost clocks of the reference Radeon VII. So $800 for what, exactly? For cheap cellphone RT hardware that did in excess of 6 gigarays back in 2016? For hybrid ray tracing in one game, where some parts of the map are selectively ray traced with lots of rasterization still in place, a noisy image, and a performance and resolution executioner at that? At least I know HBM is expensive, so for the capacity and bandwidth it brings, I think it's well worth it in comparison. Radeon wins over NV with less stutter every time.
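For the bandwidth comparison above, the peak figures follow directly from bus width and per-pin data rate. The numbers below are the published specs for these two cards; the function itself is just my illustration of the formula:

```python
# Peak memory bandwidth = (bus width in bits / 8 bytes) * per-pin data rate.
# Radeon VII: 4096-bit HBM2 bus at 2 Gbps per pin.
# RTX 2080:   256-bit GDDR6 bus at 14 Gbps per pin.

def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak bandwidth in GB/s for a memory bus."""
    return bus_width_bits / 8 * data_rate_gbps

radeon_vii = peak_bandwidth_gbs(4096, 2)   # 1024 GB/s
rtx_2080   = peak_bandwidth_gbs(256, 14)   # 448 GB/s
print(f"Radeon VII: {radeon_vii:.0f} GB/s, RTX 2080: {rtx_2080:.0f} GB/s")
```

That's how the wide-but-slow HBM2 bus ends up with more than double the GDDR6 card's 448 GB/s despite the lower per-pin rate.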
All NV has done in this industry is cripple it, cripple price-to-performance, introduce lots of proprietary tools, software, and of course hardware (G-Sync modules in monitors), and then go on to monopolize it. GameWorks was set up to kill AMD performance, and then people go on forums saying NV just crushes AMD in performance, when AMD cards have always been better on paper: more raw power, higher TFLOPS, and so on. They pretend to be oblivious to the real issue and why AMD's performance has trailed NV's all these years. Yet things are changing; people are finally starting to see that power utilized where AMD has not been hamstrung by GameWorks, DX11, etc. Vulkan and DX12 have shown the power of AMD cards a bit better recently. So you wonder why a game like Assassin's Creed performs so badly on AMD, why FF does... Then you look at FC5 and it performs great on AMD cards; you look at The Division and it performs great there; you look at your Forzas, Dirt, Battlefields, and Strange Brigade and you see the uplift over Nvidia's cards. And it's a good thing AMD is partnering with game companies on select titles, not to cripple NV cards but to extract the best performance they can from their own.
If they did not do that, NV would just continue to riddle us with performance-defeating features like PhysX, HairWorks, the entire GameWorks portfolio ("smokeworks" included), and other features like HBAO+ and VXAO; the list goes on. I'm all for advancements in technology, but if you can't have those things enabled at playable framerates at my current playing resolution, be gone with it, it's not ready for primetime. So what's the point of NV calling on me to spend hundreds of dollars when everybody just turns off the very features they boast and tout those cards on, for better performance and resolution anyway? But no: buy this 2080, and if it doesn't have enough power to enable those performance-crippling features, buy the Ti at $1,200. Still not enough power? No problem, there's a $2,499 RTX Titan. They just hook you in the web, well, some people at least, and make you spend exorbitant amounts of cash on the promise of great new technology that most will turn off, and they keep you spending more if you want to boast about GPU e-peens online.
Yeah, make no bones about it, NV marketing is something else. It works, and most of their fans do the most effective marketing for them: buying their cards, even those on 1050 Tis pretending they're on 2080 Tis, coming online all the more to trash any AMD product that runs circles around their low-end NV cards, never seeing the damage they're doing to the industry at large. Yet nothing lasts forever, and enough people have come forward to speak against the bewitching spell, or rather curse, NV has placed on this industry. The 3.5 GB fiasco on the 970 was only the start of it, the articles on GameWorks were another, then that GPP thing really turned heads and caused a riot, and the lackluster Turing cards at such exorbitant prices were the last straw. It's no wonder Jensen is spooked. That lawsuit doesn't help either.
As for the PC situation, I too miss the old days of devs maxing out PC hardware. I miss Crytek; I miss the Monolith of old (F.E.A.R.). Those, along with Far Cry (2004) from Ubisoft, were all great accomplishments on PC as far as tech was concerned, and it was the time you really took notice of PC hardware and what it could do. Right now it's just Nvidia implementing ridiculous features in games to kill framerate and resolution, instead of devs actually maxing out PC hardware smartly. At least with those games you saw the advancements, you saw the accomplishments, and you didn't mind upgrading to experience those titles. Hell, even I upgraded to an 8800 GTX Ultra back then to experience Crysis, and I still barely got a stable 30 fps at 1280x1024. Yet don't people remember how expensive those cards were? They were easily over $600; I think MSRP for the Ultra was over $800 at debut.
Still, that type of PC development died because people just pirated games to no end; Crytek and Ubisoft had huge issues with that in the early and mid 2000s. All the Skidrow releases and the like did a number on PC development. What's the point of putting all that effort into PC development if people just download a torrent? Remember COD1 (2003)? I played that on PC, along with United Offensive. That franchise only took off from COD2 on the 360 and went bananas with Modern Warfare. So consoles offered something PC didn't; that's why the devs left. I still remember being wowed by Doom 3; that first trailer blew everyone away. I also remember the Half-Life 2/Source reveal, it was out of this world, the graphics and the physics combined. Those times with PC devs are gone; it's console devs doing the wowing now, with much inferior kit. I'm just happy that next gen, consoles can finally have a decent CPU and GPU, and it's thanks to AMD. Sorry NV, but the last time NV gave Xbox a GPU, it ended in a fracas. The PS3 had to cut costs by not going the double-Cell route, so they went to Nvidia and got a lousy GPU at a serious markup. Now AMD is feeding both consoles their technology and going deep into R&D to make powerful APUs and high-performance systems in SFFs. Thank heavens for AMD, because if consoles had to depend on NV or even Intel, we'd be in a lot of trouble. A 4-core Pentium with a 1070-class GPU for next gen, perhaps? Pffftt! And at a serious markup too?
He isn't wrong. AMD isn't really bringing anything new to the table.
What new tech is Nvidia bringing? You know there were ray-traced games, and devs/companies utilizing that technology, well before BFV or Nvidia, even on consoles. This hybrid reflection solution you see in BFV is not even impressive. And I hope no one is saying DLSS is new; it's just upscaling through AI cores. The image is worse, and if you just set your PC to native 1440p, you get much more performance. It's crazy, because when AMD/Sony started going the reconstruction route with checkerboard rendering, they never claimed you should go out and buy $800 GPUs or consoles for such a feature. Soon people will try to justify RTX cards because they give you 16x AF.
Food for thought...
If you feel the need to talk about the competition, then they actually are competition.
Succinct. I even heard from a little bird that Jensen cashed in over 100,000 of his personal shares and raked in 18 million dollars when Nvidia stock skyrocketed earlier last year, just after he did the Todd Howard to investors, btw.