If it was, I would tell you. You should check out the professional reviews, where the 3080 Tier beats a physical Series X on input latency in some games (though not a local PC), and even then the difference is minimal. Tom's Hardware also did an in-depth review, and so did Digital Foundry. Turn on 120 Hz with games that support Nvidia Reflex and see the magic. Unless you're a full-on esports gamer you literally won't see the ms difference, so why even pretend? LDAT tests also prove its proficiency.
Sounds like he ruffled your feathers. Nvidia's pricing is absurd these days, but their tech is years ahead of the competition.
Their tech isn’t ‘years ahead’ of AMD, and they certainly didn’t have anything better for consumers at console price points back in 2020.
Look at the DLSS 2 release date and then look at the FSR 2 release date.
Look at CUDA and look at no CUDA.
Look at the DLSS 3 release date and look at the FSR 3 release date.
Look at Nvidia's driver history and look at AMD's driver history.
While AMD's tech is great at rasterisation, they are years behind in features.
Most*
people who wants to buy console cant afford these ridiculous PC hardware anyway LOL
The Nvidia 3080 Tier is $99 every 6 months. Try it, you don't need expensive hardware anymore, just a good display and a bomb-ass internet connection, 25 Mbps minimum. Don't listen to the naysayers, try it yourselves.
A good TV will last a decade and a good set of speakers will last two or three decades.
A good number of them spend more than that on audio and TV equipment.
Shit, some have speakers that cost more than that.
Or they could avoid making the retarded comparison between their 2023 GPUs on $3000 laptops and $500 consoles from 2020.
Maybe they should come on stage, bow deeply, apologize for not being Sony and beg for forgiveness. Show true honour like members of the Samurai class.
DLSS was released before FSR2. But DLSS was not the first Temporal upscaler.
For example, Epic's TAAU was introduced in UE4 in 2018, two years before DLSS 2.0.
NVidia does get the credit for being the first to use ML to smooth out the end result.
NVidia was not the first to use interpolation to generate new frames. TVs have been doing it for many years before.
Like DLSS, they are the first to use ML to try to improve the final result.
The issues with AMD drivers have been gravely overblown by NVidia fanboys.
As someone who has really used GPUs from both brands, I can say they are not that far apart.
And in recent times AMD has improved them to the point where they are as good as, if not better than, NVidia's.
As someone who recently got into AMD, after almost a decade with only NVidia GPUs, I was surprised at how much better the Control Panel on Radeon is compared to NVidia's.
Nvidia's control panel is like something pulled from the 90's.
AMD created Freesync, based off the VESA standard, at a time when NVidia was pushing a $150 G-sync module. Today G-Sync is dead and almost all monitors and TVs use AMD's Freesync. NVidia does get the credit for being able to hijack the marketing of Freesync and calling it G-sync-compatible.
AMD still has the better front-end in a GPU, being able to have a lower overhead in drivers. NVidia is still doing a lot of stuff in software with their drivers.
he's cringe cause he wears a leather jacket? ok lol
If you want to see cringe, this is Nvidia's leader. The guy actually thinks he's Fonzie.
if only there was proper competition to keep Nvidia in check.
Nvidia can fuck right off after their stupid ass pricing for all the 40 series cards.
You are a Sony pleb?
XBOX rats or Nvidia rats... makes no difference to me, you're all vermin!
They had to, because the only displays that included the parts of the VESA standard to do VRR were in laptops, for power-saving features. It's why AMD demoed it on laptops for over a year before a desktop monitor came to market with the functionality. It was also worse for a long time, with bad operating frequency windows, shit like 50-90 Hz ranges when every G-sync display was doing 30 Hz to max refresh, and it only worked over HDMI when DisplayPort was an important consideration at the time. Yes, it's fine now, many years later, but this is not one of Nvidia's typical "we did it to fuck other companies" moves; there were some valid reasons for the module.
I know, love my freesync monitor. Never had a g-sync module one, which were apparently supposed to perform a bit better, but that is very model specific I guess and I wouldn't be surprised if the newer freesync ones are better than old g-sync module ones. They are not worth the price imo, and the majority thankfully uses freesync, if we don't count TVs, which I don't really care about.
Never said they were the first or the only ones in anything, just that they have been earlier than AMD and that their results are better, as I get better visuals and slightly higher framerates with DLSS compared to FSR and XeSS, which are still better than TAAU Gen 5. There's also a lot more and better games with DLSS 2+ than with FSR 2. The "AMD blocking DLSS from their sponsored titles" conspiracy that some people have come up with is hilarious. I want to know if it's true or not, as either result would be funny as hell.
Again, never said they were the first or the only ones in anything, just better. I use Topaz Labs a lot and it takes half a day to interpolate an hour of 30 fps video to 60 fps there, and the results are not as good as DLSS 3, which is done in real time (although I can run 3 different instances, which cuts the time down a lot). TV interpolation is a joke, everyone agrees on that. AMD have not released FSR 3, so I have nothing to compare it to, which reinforces the argument of them "being behind".
I believe they are in a very usable state compared to early RDNA 1 times, but they are still not as good as Nvidia's. Close enough though. AMD have been pretty bad at older APIs for a long time.
I hate the Nvidia control panel. It's god awful, you are 100% right. Adrenalin is nicer, but I like the customization of Nvidia Profile Inspector. Ultimately it doesn't matter that much since I rarely use them, and when I do they both work, which is their whole point.
Not driver related, but did you see what is happening with 7900xt(x) currently?
The CPU scheduler overhead is a big deal to me since I end up CPU bottlenecked in a lot of games even with Alder Lake; it is not talked about enough. I don't know if they addressed this in the last year because it's been quite a while since I followed this driver scheduling topic.
You forgot AMD's SAM/ReBAR, which is cool.
I think both companies are very scummy nowadays. Nvidia way more so, although AMD is trying to catch up after years of pretending to be a nice pro-consumer company, but Nvidia's products are better.
Nvidia straight up making misleading console war statements is lame and should be pointed out. They are using their bullshit TFLOPS metrics to state that their 60 TFLOPS 4080 is 5x faster than the XSX, which is true when counting TFLOPS, but the real performance gap is roughly 2x-2.5x. Maybe a little more in ray-traced games, but we know that the XSX performs like a 2080 in most games. Especially in UE5 demos like the Matrix.
Those numbers are either 1080p or completely wrong. The 4090 is actually quite a bit more than 3x faster than the XSX.
Even the 4090 is less than 200% faster than the XSX, which would put it below 3x.
Nvidia has been fudging their TFLOPS numbers, which no longer equate 1:1 with performance like they did up until the 2000 series. It's just a bunch of baloney.
Mid-gen console refreshes are coming and they should reduce this 2-3x gap even more. If Nvidia was offering 5x more performance than consoles at $1,200, I would be there day one.
6000 series vs 3000 series TFLOPS were never 1:1 with performance though? If TFLOPS were the measure of raster on those last-gen cards, the 3090 would single-handedly beat the sh1t out of the 6900 XT. It's baffling people compare the Series X GPU to a 2080 or 2080S; it's never gonna be apples to apples.
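As an aside, here is a quick back-of-the-envelope check of the numbers being argued above. It is only a sketch: the 60 TFLOPS figure for the 4080 and the 2x-2.5x "real" gap are the claims from the posts above, not measurements, while 12.15 TFLOPS is the published FP32 spec for the Series X.

```python
# Sanity check of the "5x faster than a Series X" marketing claim, using only
# the figures quoted in this thread plus the Series X's published FP32 spec.
rtx_4080_tflops = 60.0      # figure claimed in the post above (marketing number)
series_x_tflops = 12.15     # Xbox Series X published FP32 spec

paper_ratio = rtx_4080_tflops / series_x_tflops
print(f"On paper (TFLOPS ratio): {paper_ratio:.1f}x")   # ~4.9x, i.e. the "5x" claim

# The thread's own estimate of the real-world gap, not a benchmark result:
claimed_real_world = (2.0, 2.5)
print(f"Claimed real-world gap: {claimed_real_world[0]}x to {claimed_real_world[1]}x")
# The mismatch is the whole argument: post-2000-series, TFLOPS no longer map 1:1 to frame rates.
```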
While you are correct, you're forgetting audio enthusiasts can upgrade their equipment more than once in a year regardless of whether their current one is good enough.
Buying a $1000 set of speakers that will last for >20 years is a completely different value proposition than buying a $1000 GPU that performs worse than a $500 console releasing 5 years later.
then who is?
this ain't it, chief.
read the room
To your point, upon release it was actually pretty remarkable how much of a bargain both the XSX and PS5 were for the price. It reminded me of the fabled 360/PS3 era. They were good buys. They're probably still good buys even if you only look at it from a performance-for-the-money perspective and ignore all other factors (which isn't very realistic).
This is really not meant to start system warring, but this was so pathetic. It happened during their CES 2023 presentation earlier today.
Remember over two years ago, right when consoles were getting too much buzz for Jensen's liking? He introduced RTX IO, which over two years later is still MIA. He also showed off some laptops that were more powerful than the consoles, but the cheapest one was like $3000.
This year, they were at it again with their Ada-powered laptop. "2x the performance of the PlayStation 5". OK, but it's also 4x the price and is coming out 2 years later, so I fucking hope it's much more powerful.
This is ironic coming from the company that continuously prices more and more gamers out of good graphics cards. If they think $2000 laptops will entice people to buy them and move away from consoles, I've got bad news for them. If you just want mid-range gaming, consoles are becoming more and more attractive options. Meanwhile, in the PC space, the mid-range is as much as $800 and will probably start at $600 for an RTX 4070.
Thank you, NVIDIOT.
All according to plan.
You actually don't own your games, salty boy, it's a license. Outside of that, your physical hardware is nothing but obsolete within months. You'll be eating crow within the next few years anyway; my plan is not to spend $$$ like an idiot.
The irony is that you still need to own games in order to use the service.
Let me put it kindly: I'd rather be a "salty boy" than be a mug.
Each to their own though; they need obedient, short-sighted people to take things where they want to.
You don't own your games anyway, you "mug", even though it's spelled out for you. Also, it's the right approach for a cloud service because then many storefronts can be accessed. Short-sightedness is thinking that conventional technology will stay consistent with contemporary ones. Like I said, the tech is already here; short-sightedness is not taking the chance to see what is being offered. Do you have a 4080? If yes, then fine, you're still a "mug" for spending thousands of dollars. Other than that, I will be enjoying playing on my 4080 Tier Enterprise with all the benefits before you salties.
There are more reasons to buy a console than price.
It's hilarious that you're bragging to me about getting cloud computing access that's reliant on a stable internet connection, such that the quality of your experience will change minute to minute, hour to hour, day to day. None of these are problems that I or any sane person will ever have to deal with.
If your only concern is "saving money" (when in fact you're not, you're literally giving it away and getting nothing in return), then congratulations, you're getting what you pay for.
I have used it. Very impressive, much more so than xCloud. Basically almost as good as native, with only 10 ms of extra latency. The problem is I have 1 Gbps, live in the same city as the servers and have a wired connection to my router. Not everyone is as lucky as me, and as distances increase the experience degrades.
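For context on how small that is, a bit of simple arithmetic (the 10 ms figure is the poster's estimate above, not an independent measurement):

```python
# Put 10 ms of extra latency in terms of rendered frames at common refresh rates.
extra_latency_ms = 10.0               # estimate quoted in the post above
for hz in (60, 120):
    frame_time_ms = 1000.0 / hz       # duration of one frame at this refresh rate
    frames = extra_latency_ms / frame_time_ms
    print(f"{hz} Hz: one frame = {frame_time_ms:.2f} ms, 10 ms ≈ {frames:.1f} frames of delay")
# Roughly 0.6 of a frame at 60 Hz and 1.2 frames at 120 Hz.
```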
So by this you mean you want games to be made to cater for, like, a 3090 or a 4080, so the majority of gamers won't be able to play those games?
Also, these boxes are bottlenecking gaming in the first place.
People are gonna hit you with the triggered emoji and shit on this post, but I actually agree.
You're literally saving on the upfront cost of $799 and then the rest of the computer. Let's say a high-end one for $2300+. Not to mention your electricity bill due to power consumption.
Also, as @Leo9 mentioned, the 3080 Tier came out last year; this year they are giving out the 4080 Tier for free if you were a Founder, which means every year and a half you may also be upgraded to a new SuperPod. The 4090 is also $99. You tell me if it's not worth it versus a $3500 PC, which will also depreciate in value.
Also: 240 FPS streaming, RTX, DLSS 3, widescreen support, 4K enhanced on the PC app, mobility on any screen, region-free when you're moving around, and Nvidia Reflex. The value proposition is insane, minus the catalog of games. I don't know why people are not being economical. Really try the service, it's amazing.
Between this and my M1 Mac Mini on Parallels I get everything I need.
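Purely to illustrate the cost argument in this exchange, here is a rough comparison using only the prices quoted in the thread ($99 per 6-month block for the top tier and the ~$3500 PC mentioned above); electricity, games, internet and resale value are all deliberately left out:

```python
# Subscription cost over time vs. the up-front PC price quoted in this thread.
sub_per_6_months = 99.0     # top-tier price stated above
pc_upfront = 3500.0         # high-end PC figure stated above

for years in (1, 3, 5, 7):
    sub_total = sub_per_6_months * 2 * years
    print(f"{years} yr: subscription ${sub_total:,.0f} vs PC ${pc_upfront:,.0f} up front")

# Break-even against the $3500 box, ignoring everything else:
print(f"Break-even after ~{pc_upfront / (sub_per_6_months * 2):.1f} years")   # ~17.7 years
```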