
NVIDIA's dig at consoles in their marketing is cringe

If it was, I would tell you. Check out the professional reviews: in some games the 3080 tier beats a physical Series X on input latency, though not a local PC, and even then the difference is minimal. Tom's Hardware did an in-depth review, and so did Digital Foundry. Turn on 120 Hz in games that support Nvidia Reflex and see the magic. Unless you're a full-on esports gamer, you literally won't notice the millisecond difference, so why even pretend? LDAT tests also back this up.
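To put the "ms difference" into perspective, here is a back-of-the-envelope latency budget. Every number below is an assumption for illustration, not a figure from those reviews.

```python
# Rough latency budget, illustrative numbers only (assumptions, not measurements).
refresh_hz = 120
frame_ms = 1000 / refresh_hz        # ~8.3 ms per displayed frame at 120 Hz
local_pc_ms = 30                    # assumed end-to-end click-to-photon on a local PC
stream_overhead_ms = 10             # assumed encode + network + decode overhead

print(f"Frame time at {refresh_hz} Hz: {frame_ms:.1f} ms")
print(f"Local PC (assumed):  {local_pc_ms} ms")
print(f"Streamed (assumed):  {local_pc_ms + stream_overhead_ms} ms")
```

On numbers like these, streaming adds roughly one frame of latency at 120 Hz, which is the kind of difference only competitive players tend to notice.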



I am a full-on esports gamer. I play CS at a high level, so obviously I would never use it for that. However, I do find the tech interesting, and 120 Hz with minimal latency would be good enough for me in single-player games. I still think these tests are done in ideal environments, so I'm skeptical, but it's good to see how the tech is coming along.
 

T4keD0wN

Member
Well, you're right. They are a terrible company, just like every other company. I don't expect them to be able to sustain their crazy pricing for much longer.
Using DLSS 3 frame generation for their insane performance claims is even more pathetic.
But to be entirely fair, the consoles are useless without their €65/year online subscription BS. Why does nobody add that to the price when discussing cost? It's a genius move on the console makers' part, exploiting consumers by making the products appear cheaper, since everyone just seemingly ignores it. You should also keep in mind that GPUs can do far more than just render games, and that Nvidia doesn't run a game store taking a 30% cut on every sale; they make money by selling GPUs.
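To make the subscription point concrete, here is the arithmetic folded into a console's sticker price. The console price and generation length are assumptions for illustration.

```python
# Illustrative only: fold the online subscription into a console's lifetime cost.
console_price_eur = 500    # assumed console price
sub_eur_per_year = 65      # the subscription figure from the post
years = 7                  # assumed length of a console generation

total = console_price_eur + sub_eur_per_year * years
print(f"Effective cost over {years} years: {total} EUR")  # 500 + 455 = 955 EUR
```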
 
Last edited:

T4keD0wN

Member
Their tech isn’t ‘years ahead’ of AMD, and they certainly didn’t have anything better for consumers at console price points back in 2020.
Look at the DLSS 2 release date and then look at the FSR 2 release date.
Look at Ampere (2020) ray tracing performance and then look at RDNA 3 (2022) ray tracing performance.
Look at the first real-time ray tracing capable cards (2018) vs AMD's (2020).
Look at CUDA and look at no CUDA.
Look at NVENC and look at no NVENC.
Look at the DLSS 3 release date and look at the FSR 3 release date.
Look at RTX Remix and look at no RTX Remix.
Look at the RTX Voice release date and the AMD Noise Suppression release date.
Look at Nvidia's driver history and look at AMD's driver history.

While AMD's cards are great at rasterisation (still behind), they are years behind in features. All they do is play catch-up.
I wish AMD were better so the market wouldn't be in such a terrible state.
 
Last edited:

winjer

Member
Look at DLSS2 release date and then look at FSR 2 release date.

DLSS was released before FSR 2, but DLSS was not the first temporal upscaler.
For example, Epic's TAAU was introduced in UE4 in 2018, two years before DLSS 2.0.
NVidia does get credit for being the first to use ML to clean up the end result.

Look at CUDA and look at no CUDA

GPGPU has existed for a long time. AMD decided to go the open-source route even before NVidia introduced CUDA.
NVidia did an amazing job of supporting CUDA, and of locking everyone else out of their ecosystem. But they didn't start GPGPU.

Look at DLSS3 release and look at FSR 3 release date

NVidia was not the first to use interpolation to generate new frames; TVs had been doing it for many years before.
Like DLSS, they are the first to use ML to try to improve the final result.

Look at nvidia drivers history and look at amd drivers history.

The issues with AMD drivers have been greatly overblown by NVidia fanboys.
As someone who has actually used GPUs from both brands, I can say they are not that far apart.
And in recent times AMD has improved them to the point where they are as good as, if not better than, NVidia's.
As someone who recently switched to AMD, after almost a decade with only NVidia GPUs, I was surprised at how much better the Radeon control panel is compared to NVidia's.
Nvidia's control panel looks like something pulled from the 90s.

While AMD's cards are great at rasterisation (still behind), they are years behind in features.

AMD is behind in RT and ML. But you are ignoring all the innovations that AMD brought in recent years.
Low-level APIs: that work was done by AMD, improving performance significantly through much better management of draw calls and much better parallelization across threads.
AMD was first with async compute, something that significantly improves performance. For several generations, async compute on NVidia GPUs either dropped performance or gained nothing at all; it took NVidia until Turing to catch up to AMD (see the sketch below).
AMD created FreeSync, based on the VESA Adaptive-Sync standard, at a time when NVidia was pushing a $150 G-Sync module. Today the G-Sync module is all but dead, and almost all monitors and TVs use AMD's FreeSync. NVidia does get credit for hijacking FreeSync's marketing by branding it "G-Sync Compatible".
AMD still has the better front-end in a GPU, allowing for lower driver overhead; NVidia is still doing a lot of that work in software, in their drivers.
AMD was also first with frame interpolation, although only for video. It was called Fluid Motion, and it might just be the basis for FSR3. Maybe.
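A toy model of why async compute matters: if graphics and compute work can overlap on the GPU, frame time approaches the maximum of the two workloads instead of their sum. This is only a sketch; the workload numbers are invented for illustration.

```python
# Toy frame-time model: serial vs. overlapped (async) execution of
# graphics and compute work. All numbers are invented for illustration.
graphics_ms = 10.0   # assumed graphics workload per frame
compute_ms = 4.0     # assumed compute workload per frame

serial_ms = graphics_ms + compute_ms          # no async: queues run back-to-back
overlapped_ms = max(graphics_ms, compute_ms)  # ideal async: full overlap

print(f"Serial:     {serial_ms:.1f} ms ({1000 / serial_ms:.0f} fps)")
print(f"Overlapped: {overlapped_ms:.1f} ms ({1000 / overlapped_ms:.0f} fps)")
```

Real gains are smaller than this ideal, since overlap is never perfect, but the shape of the argument is the same.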
 

TGO

Hype Train conductor. Works harder than it steams.
people who want to buy consoles can't afford this ridiculous PC hardware anyway LOL
Most*
A good number of them spend more than that on audio and TV equipment.
Shit, some have speakers that cost more than that.
 

GHG

Member
Nvidia's 3080 tier is $99 every 6 months. Try it, you don't need expensive hardware anymore, just a good display and a bomb-ass internet connection, 25 Mbps minimum. Don't listen to the naysayers, try it yourselves.

All according to plan.

(meme image)
 

ToTTenTranz

Banned
Most*
A good number of them spend more than that on audio and TV equipment.
Shit, some have speakers that cost more than that.
A good TV will last a decade, and a good set of speakers will last two or three decades.
Buying a $1000 set of speakers that will last more than 20 years is a completely different value proposition from buying a $1000 GPU that performs worse than a $500 console released 5 years later.
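The amortization argument is easy to make explicit. The prices and lifespans below are assumptions for illustration, not claims about any particular product.

```python
# Illustrative cost-per-year comparison; prices and lifespans are assumptions.
items = {
    "speakers": (1000, 20),  # (price in USD, assumed useful life in years)
    "gpu":      (1000, 5),   # GPU assumed relevant for ~5 years
}

for name, (price, life) in items.items():
    print(f"{name}: {price / life:.0f} USD/year over {life} years")
```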
 

T4keD0wN

Member
DLSS was released before FSR 2, but DLSS was not the first temporal upscaler.
For example, Epic's TAAU was introduced in UE4 in 2018, two years before DLSS 2.0.
NVidia does get credit for being the first to use ML to clean up the end result.
Never said they were the first or the only ones at anything, just that they have been earlier than AMD and that their results are better: I get better visuals and slightly higher framerates with DLSS compared to FSR and XeSS, which are still better than TAAU Gen 5. There are also a lot more, and better, games with DLSS 2+ than with FSR 2. The conspiracy theory some people have come up with, that AMD blocks DLSS from their sponsored titles, is hilarious. I want to know whether it's true or not, as either result would be funny as hell.
NVidia was not the first to use interpolation to generate new frames; TVs had been doing it for many years before.

Like DLSS, they are the first to use ML to try to improve the final result.
Again, never said they were the first or the only ones at anything, just better. I use Topaz Labs a lot, and it takes half a day there to interpolate an hour of 30fps video to 60fps, and the results are not as good as DLSS 3, which runs in real time (although I can run 3 different instances, which cuts the time down a lot). TV interpolation is a joke, everyone agrees on that. AMD has not released FSR 3, so I have nothing to compare it to, which reinforces the argument that they are "behind".

The issues with AMD drivers have been greatly overblown by NVidia fanboys.
As someone who has actually used GPUs from both brands, I can say they are not that far apart.
And in recent times AMD has improved them to the point where they are as good as, if not better than, NVidia's.
As someone who recently switched to AMD, after almost a decade with only NVidia GPUs, I was surprised at how much better the Radeon control panel is compared to NVidia's.
Nvidia's control panel looks like something pulled from the 90s.
I believe they are in a very usable state compared to early RDNA 1 times, but they are still not as good as Nvidia's. Close enough, though. AMD has been pretty bad with older APIs for a long time.
I hate the Nvidia control panel. It's god-awful, you are 100% right. Adrenalin is nicer, but I like the customization of Nvidia Profile Inspector. Ultimately it doesn't matter much, since I rarely use them, and when I do, they both work, which is their whole point.
Not driver related, but did you see what is happening with the 7900 XT(X) currently?
AMD created FreeSync, based on the VESA Adaptive-Sync standard, at a time when NVidia was pushing a $150 G-Sync module. Today the G-Sync module is all but dead, and almost all monitors and TVs use AMD's FreeSync. NVidia does get credit for hijacking FreeSync's marketing by branding it "G-Sync Compatible".
I know, I love my FreeSync monitor. Never had one with a G-Sync module, which were apparently supposed to perform a bit better, but that is very model-specific, I guess, and I wouldn't be surprised if the newer FreeSync ones are better than the old G-Sync-module ones. They are not worth the price, IMO, and the majority thankfully uses FreeSync, if we don't count TVs, which I don't really care about.
AMD still has the better front-end in a GPU, allowing for lower driver overhead; NVidia is still doing a lot of that work in software, in their drivers.
The CPU scheduler overhead is a big deal to me, since I end up CPU-bottlenecked in a lot of games even with Alder Lake; it is not talked about enough. I don't know if they addressed it in the last year, because it's been quite a while since I followed the driver-scheduling topic.

You forgot AMD's SAM/ReBAR, which is cool.

I think both companies are very scummy nowadays, Nvidia way more, although AMD is trying to catch up after years of pretending to be a nice, pro-consumer company. But Nvidia's products are better.
 
Last edited:

V1LÆM

Gold Member
Nvidia can fuck right off after their stupid-ass pricing for all the 40-series cards.
If only there was proper competition to keep Nvidia in check.

I'm one of the fools paying for a 4080, but AMD kinda made it look like a good deal 🤷‍♂️ Last time I checked the price of a 7900 XTX it was £1,050. Right now the only available model I can find is £1,159.

I paid £1,199 for my 4080. That extra £40 got me similar raster performance, better RT performance, and DLSS 3.0 support. Totally worth it, even if the difference were still £150. Also, the new AMD cards have a manufacturing issue where they run at 100°C or higher, due to a problem with the heatsinks or something. The 7900 XTX also has a higher TDP than the 4080, by about 35 W. Not a big difference, but still worth noting.

I'm all for competition, and I'd love nothing more than to have another AMD card, but they can't really compete with Nvidia on ray tracing/AI stuff or at the top end. The best we got from AMD is a card that goes up against the 4080 in raster performance. I don't think they are bringing out a card to go against the 4090 or the rumored 4090 Ti. Nvidia can basically charge what they want for the 4090, and again, the 4080 seems like a good deal. The 7900 XTX should be £900-£1,000.

Intel doesn't even begin to light a fire under Nvidia. They need to catch up to AMD first lol.
 
All the jabs are meaningless anyway, because all that hardware power on PC is spent on insignificant trash like 4K res and ray tracing in the same games you can play on consoles, only slightly worse-looking.

I remember the good old days, when the expensive hardware could run entire games that just were not possible on consoles. That was awesome. I, for the life of me, cannot understand why someone would pay these insane prices just to get a few extra pixels and frames in the same games.
 

hlm666

Member
I know, I love my FreeSync monitor. Never had one with a G-Sync module, which were apparently supposed to perform a bit better, but that is very model-specific, I guess, and I wouldn't be surprised if the newer FreeSync ones are better than the old G-Sync-module ones. They are not worth the price, IMO, and the majority thankfully uses FreeSync, if we don't count TVs, which I don't really care about.
They had to, because the only displays that included the parts of the VESA standard needed for VRR were in laptops, for power-saving features. It's why AMD demoed it on laptops for over a year before a desktop monitor with the functionality came to market. Early FreeSync was also worse for a long time, with bad operating-frequency windows, stuff like 50-90 Hz ranges when every G-Sync display was doing 30 Hz to max refresh, and it only worked over DisplayPort at a time when HDMI support mattered too. Yes, it's fine now, many years later, but this was not one of Nvidia's typical screw-the-competition moves; there were valid reasons for the module.
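On those frequency windows: the practical problem with a range like 50-90 Hz is low framerate compensation (LFC), which roughly requires the maximum refresh to be at least twice the minimum, so frames can be doubled when fps drops below the window. A quick sketch of that rule of thumb:

```python
# LFC rule of thumb: frame doubling below the VRR window only works
# when max_hz >= 2 * min_hz. The windows below are examples, not a survey.
def has_lfc(min_hz: int, max_hz: int) -> bool:
    return max_hz >= 2 * min_hz

for window in ((50, 90), (48, 144), (30, 120)):
    print(window, "LFC possible:", has_lfc(*window))
```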
 

JackSparr0w

Banned
GeForce Now's RTX 4080 tier utterly destroys consoles in terms of value, especially in Europe, where the energy savings mean you essentially get a 4080 rig for free.
 

spons

Gold Member
It's just gaming industry infighting, you know, like we've had for decades now. Pointless and maybe even harmful to the entire sector.
 

winjer

Member
Never said they were the first or the only ones at anything, just that they have been earlier than AMD and that their results are better: I get better visuals and slightly higher framerates with DLSS compared to FSR and XeSS, which are still better than TAAU Gen 5. There are also a lot more, and better, games with DLSS 2+ than with FSR 2. The conspiracy theory some people have come up with, that AMD blocks DLSS from their sponsored titles, is hilarious. I want to know whether it's true or not, as either result would be funny as hell.

There is a long history of NVidia- and AMD-sponsored titles excluding tech from the other brand.
So I wouldn't be surprised if that continues to happen with FSR and DLSS.

Again, never said they were the first or the only ones at anything, just better. I use Topaz Labs a lot, and it takes half a day there to interpolate an hour of 30fps video to 60fps, and the results are not as good as DLSS 3, which runs in real time (although I can run 3 different instances, which cuts the time down a lot). TV interpolation is a joke, everyone agrees on that. AMD has not released FSR 3, so I have nothing to compare it to, which reinforces the argument that they are "behind".

If you are considering video, then AMD was first to frame interpolation, with Fluid Motion. NVidia was first only in gaming.

I believe they are in a very usable state compared to early RDNA 1 times, but they are still not as good as Nvidia's. Close enough, though. AMD has been pretty bad with older APIs for a long time.
I hate the Nvidia control panel. It's god-awful, you are 100% right. Adrenalin is nicer, but I like the customization of Nvidia Profile Inspector. Ultimately it doesn't matter much, since I rarely use them, and when I do, they both work, which is their whole point.
Not driver related, but did you see what is happening with the 7900 XT(X) currently?

The thing with the 7900 is a matter of a bad batch of vapor chambers, not a design flaw from AMD.
Truth be told, neither AMD, nor NVidia, nor Intel makes their own vapor chambers or coolers; they hire other companies to make them.
So a defect like the one in AMD's vapor chamber can happen to any company.

The issue with older APIs came with GCN, an architecture made to break away from high-level APIs and usher in an age of low-level APIs.
If you look at the 5000 series, they were very competitive with anything NVidia had. In fact, in power efficiency, AMD was miles ahead of NVidia's Fermi.
But DX12 and Vulkan adoption was very slow, and AMD suffered a lot because of that.
Still, the issues with AMD drivers at the time were greatly overblown. I had an HD 7950, a GTX 680, a GTX 970, and an R9 390 during that period, and I did not have many more issues with one brand than the other.
AMD was, in fact, a bit slower with optimized drivers for games. But this was mostly due to deals where one brand or the other would block the developer from providing a pre-release build for the competitor to optimize against.
NVidia was notorious for these dealings, since many games were withheld from AMD during development. But AMD did a few as well. I remember that Tomb Raider was an AMD-sponsored game, and they blocked NVidia from getting it until release; for several weeks, the game was almost unplayable on NVidia hardware.
In the end, these shenanigans only hurt consumers.

The CPU scheduler overhead is a big deal to me, since I end up CPU-bottlenecked in a lot of games even with Alder Lake; it is not talked about enough. I don't know if they addressed it in the last year, because it's been quite a while since I followed the driver-scheduling topic.

You've probably noticed how AMD GPUs perform better at lower resolutions. In a few games at 1080p, a 7900 XTX matches a 4090, but at 4K it can only match the 4080.
This is because at lower resolutions games become CPU-bound, and in those cases driver overhead has a significant impact.
This is also the reason you'll see some NVidia fans demanding that all reviews be done with a 13900K, with the fastest memory, and with an overclock.
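The effect is easy to model: frame time is roughly the maximum of CPU time (including driver overhead) and GPU time, and GPU time shrinks as resolution drops while CPU time does not. Every number below is invented purely to show the shape of it.

```python
# Toy model: frame time ~ max(CPU time + driver overhead, GPU render time).
# GPU time scales with resolution; CPU time does not. Numbers are invented.
cpu_ms = 6.0
driver_overhead_ms = {"thin driver": 0.5, "thick driver": 2.0}   # hypothetical
gpu_ms = {"1080p": 4.0, "4K": 12.0}                              # hypothetical

for res, g in gpu_ms.items():
    for drv, o in driver_overhead_ms.items():
        frame = max(cpu_ms + o, g)
        print(f"{res:>5} | {drv:12}: {frame:5.1f} ms -> {1000 / frame:4.0f} fps")
```

At the hypothetical 1080p numbers the thicker driver costs real fps; at 4K the GPU hides it completely, which is the pattern the benchmarks show.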

You forgot AMD's SAM/ReBAR, which is cool.

ReBAR is really cool. It gives a nice performance boost in several games on AMD and Intel GPUs.
But it seems to be abandoned tech for NVidia. Fortunately, NVidia users can enable it for most games using NVInspector.
BTW, NVInspector is not made by NVidia but by fans, so it doesn't really count as an NVidia innovation.
AMD did have a couple of similar apps several years ago, but they are long gone. Sadly, there isn't anything as good on the AMD side.
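On Linux you can sanity-check whether a large BAR is actually exposed, since Resizable BAR shows up as a multi-gigabyte memory region on the GPU, versus the classic 256 MB aperture. This is only a sketch: the NVIDIA vendor ID filter, the size threshold, and the lspci output format are assumptions that may not hold on every system.

```python
# Sketch: look for a multi-GB BAR on an NVIDIA GPU via lspci (Linux only).
# Assumptions: vendor ID 10de selects NVIDIA cards, and a memory region
# of 1 GB or more suggests Resizable BAR is active.
import re
import subprocess

out = subprocess.run(["lspci", "-v", "-d", "10de:"],
                     capture_output=True, text=True).stdout

sizes = re.findall(r"\[size=(\d+)([MG])\]", out)
big = [f"{n}{u}" for n, u in sizes if u == "G" or int(n) >= 1024]
print("Large BARs found:", big or "none (ReBAR likely off or not visible)")
```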

I think both companies are very scummy nowadays, Nvidia way more, although AMD is trying to catch up after years of pretending to be a nice, pro-consumer company. But Nvidia's products are better.

Yes, both companies are price-gouging consumers hard. AMD only has slightly lower prices because they can't push theirs any higher. But they would if they could.
 

SlimySnake

Flashless at the Golden Globes
Sounds like he ruffled your feathers. Nvidia's pricing is absurd these days, but their tech is years ahead of the competition.
Nvidia straight up making misleading console-war statements is lame and should be pointed out. They are using their bullshit TFLOPS metrics to claim that their 60-TFLOPS 4080 is 5x faster than the XSX, which is true when counting TFLOPS, but the real-world performance is roughly 2x-2.5x. Maybe a little more in ray-traced games, but we know the XSX performs like a 2080 in most games, especially in UE5 demos like the Matrix one.

(benchmark chart)


Even the 4090 is less than 200% faster than the XSX, which would put it below 3x.

Nvidia has been fudging their TFLOPS numbers, which no longer equate 1:1 with performance like they did up until the 2000 series. It's just a bunch of baloney.

Midgen console refreshes are coming, and they should narrow this 2-3x gap even further. If Nvidia were offering 5x console performance at $1,200, I would be there day one.
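The gap between the paper ratio and the real-world ratio is plain arithmetic. The TFLOPS figures below are the ones cited in this thread, and the measured multiplier is the estimate above, not an independent measurement.

```python
# The post's point as arithmetic: marketing TFLOPS ratio vs. an
# estimated real-game multiplier. Figures are as claimed in the thread.
claimed_4080_tflops = 60.0   # Nvidia's marketing figure, per the post
xsx_tflops = 12.15           # the commonly cited Series X FP32 number
measured_multiplier = 2.5    # the post's estimate of real-game performance

paper_ratio = claimed_4080_tflops / xsx_tflops
print(f"Paper ratio:    {paper_ratio:.1f}x")   # ~4.9x, i.e. the '5x' claim
print(f"Measured ratio: {measured_multiplier:.1f}x")
```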
 

Gaiff

SBI’s Resident Gaslighter
Nvidia straight up making misleading console-war statements is lame and should be pointed out. They are using their bullshit TFLOPS metrics to claim that their 60-TFLOPS 4080 is 5x faster than the XSX, which is true when counting TFLOPS, but the real-world performance is roughly 2x-2.5x. Maybe a little more in ray-traced games, but we know the XSX performs like a 2080 in most games, especially in UE5 demos like the Matrix one.

(benchmark chart)


Even the 4090 is less than 200% faster than the XSX, which would put it below 3x.

Nvidia has been fudging their TFLOPS numbers, which no longer equate 1:1 with performance like they did up until the 2000 series. It's just a bunch of baloney.

Midgen console refreshes are coming, and they should narrow this 2-3x gap even further. If Nvidia were offering 5x console performance at $1,200, I would be there day one.
Those numbers are either 1080p or completely wrong. The 4090 is actually quite a bit more than 3x faster than the XSX.

(benchmark chart)


Here it's already 2.62x faster than the 3070, which is around 30% faster than the 2080. That puts the 4090 in the ballpark of 3.3x faster than the XSX in raster, and 4-5x in ray tracing.
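Chaining those ratios makes the estimate explicit. Both multipliers are the ballpark figures from this exchange, not fresh measurements.

```python
# Chained relative-performance estimate from the ratios cited above.
r_4090_vs_3070 = 2.62   # from the chart above
r_3070_vs_2080 = 1.30   # "around 30% faster"

# Treating the XSX as roughly 2080-class, per the earlier post:
r_4090_vs_xsx = r_4090_vs_3070 * r_3070_vs_2080
print(f"4090 vs XSX (raster, chained estimate): {r_4090_vs_xsx:.2f}x")  # ~3.4x
```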
 

clampzyn

Member
Nvidia has been fudging their TFLOPS numbers, which no longer equate 1:1 with performance like they did up until the 2000 series. It's just a bunch of baloney.
The 6000 series vs the 3000 series was never 1:1 with performance either. If TFLOPS were the measure of raster on those last-gen cards, the 3090 would single-handedly beat the shit out of the 6900 XT. It's baffling that people compare the Series X GPU to a 2080 or 2080S; it's never going to be apples to apples.
 

TGO

Hype Train conductor. Works harder than it steams.
A good TV will last a decade, and a good set of speakers will last two or three decades.
Buying a $1000 set of speakers that will last more than 20 years is a completely different value proposition from buying a $1000 GPU that performs worse than a $500 console released 5 years later.
While you are correct, you're forgetting that audio enthusiasts can upgrade their equipment more than once a year, regardless of whether their current gear is good enough.
There's always something better and new.
I mean, I've had two subs in the space of a year, and I'm planning to upgrade my fronts this year too, and possibly get a secondary sub to match the new one.
And I've had 3 TVs in the last 8 years.
My point is you shouldn't assume that just because someone has a console, they can't afford a PC.
Most even have a PC for other things, just not gaming.
 

Razvedka

Banned
This is really not meant to start system warring, but this was so pathetic. It happened during their CES 2023 presentation earlier today.

Remember over two years ago, right when the consoles were getting too much buzz for Jensen's liking? He introduced RTX IO, which over 2 years later is still MIA. He also showed off some laptops that were more powerful than the consoles, but the cheapest one was like $3000.

This year, they were at it again with their Ada-powered laptops. "2x the performance of the PlayStation 5." OK, but it's also 4x the price and is coming out 2 years later, so I fucking hope it's much more powerful.

This is ironic coming from the company that continuously prices more and more gamers out of good graphics cards. If they think $2000 laptops will entice people to buy them and move away from consoles, I've got bad news for them. If you just want mid-range gaming, consoles are becoming more and more attractive options. Meanwhile, in the PC space, mid-range is as much as $800 and will probably start at $600 for an RTX 4070.

Thank you, NVIDIOT.
To your point: upon release, it was actually pretty remarkable how much of a bargain both the XSX and PS5 were for the price. It reminded me of the fabled 360/PS3 era; they were good buys. They're probably still good buys even looking purely at performance for the money and ignoring all other factors (which isn't very realistic).
 

Fahdis

Member
All according to plan.

(meme image)

You actually don't own your games, salty boy ;) it's a license. Beyond that, your physical hardware is obsolete within months or a year, depreciating in value the whole time. You'll be eating crow within the next few years anyway while you play on the cloud 😂 My plan is not to spend $$$ like an idiot.

- Pays for his internet through an ISP
- Buys a licence to a game
- If on PC, buys a license to an OS
- Probably pays for a console online service
- Has to sign a EULA anyway to use the box he has, which the company can cut off from online services if it catches any misuse.

Your meme literally applies to yourself.
 
Last edited:

GHG

Member
You actually don't own your games, salty boy ;) it's a license. Beyond that, your physical hardware is obsolete within months. You'll be eating crow within the next few years anyway 😂 My plan is not to spend $$$ like an idiot.

The irony is that you still need to own games in order to use the service.

Let me put it kindly: I'd rather be a "salty boy" than a mug.

Each to their own though; they need obedient, short-sighted people to take things where they want them to go.
 

Fahdis

Member
The irony is that you still need to own games in order to use the service.

Let me put it kindly: I'd rather be a "salty boy" than a mug.

Each to their own though; they need obedient, short-sighted people to take things where they want them to go.

You don't own your games anyway, you "mug" ;) even though it's spelled out for you. Also, it's the right approach for a cloud service, because then many storefronts can be accessed. Short-sightedness is assuming conventional technology will keep pace with what's coming. Like I said, the tech is already here; short-sightedness is not taking the chance to see what is being offered. Do you have a 4080? If yes, then fine, you're still a "mug" for spending thousands of dollars. Other than that, I will be enjoying playing on my 4080 Tier Enterprise with all the benefits before you salties ;).
 

GHG

Member
You don't own your games anyway, you "mug" ;) even though it's spelled out for you. Also, it's the right approach for a cloud service, because then many storefronts can be accessed. Short-sightedness is assuming conventional technology will keep pace with what's coming. Like I said, the tech is already here; short-sightedness is not taking the chance to see what is being offered. Do you have a 4080? If yes, then fine, you're still a "mug" for spending thousands of dollars. Other than that, I will be enjoying playing on my 4080 Tier Enterprise with all the benefits before you salties ;).

It's hilarious that you're bragging to me about cloud computing access that's reliant on a stable internet connection, meaning the quality of your experience will change minute to minute, hour to hour, day to day. None of these are problems that I, or any sane person, will ever have to deal with.

If your only concern is "saving money" (when in fact you're not; you're literally giving money away and getting nothing in return), then congratulations, you're getting what you pay for.
 

Fahdis

Member
It's hilarious that you're bragging to me about cloud computing access that's reliant on a stable internet connection, meaning the quality of your experience will change minute to minute, hour to hour, day to day. None of these are problems that I, or any sane person, will ever have to deal with.

If your only concern is "saving money" (when in fact you're not; you're literally giving money away and getting nothing in return), then congratulations, you're getting what you pay for.

Are you saying spending $1500+ on just a GPU is the chad move? I can afford it easily, but it's a dumb economic move, and just like consoles, PCs are not efficient from a business perspective when you can leverage cloud uptime by sharing server time and partitioning instances on a VM. Why the hell do you think every app is moving to AWS, Azure, or Google Cloud? Also, these boxes are bottlenecking gaming in the first place.

Also, what a stupid take on a wired gigabit Ethernet connection (which of course I have). Perhaps your situation is crap, but as said before, a 25 Mbps connection in a third-world country can give access to games thanks to regional pricing. And besides, if you had used the service on a higher tier, you'd know you can customize your bitrate. As for your point, anything can happen server-side when you're playing your games online 😂 that is, if you exclusively play online games anyway. You're one of those people who can see but are blind anyway. Let's end our interaction here.

You have no flex. Hold onto your box with the games you think you own and stay there. I like how you give selective replies with no actual answers, too.
 

Zathalus

Member
Nvidia's 3080 tier is $99 every 6 months. Try it, you don't need expensive hardware anymore, just a good display and a bomb-ass internet connection, 25 Mbps minimum. Don't listen to the naysayers, try it yourselves.
I have used it. Very impressive, much more so than xCloud. Basically almost as good as native, with only about 10 ms of extra latency. The problem is that I have 1 Gbps, live in the same city as the servers, and have a wired connection to my router. Not everyone is as lucky as me, and as distance increases, the experience degrades.
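The distance point is mostly physics: light in fiber travels at roughly 200,000 km/s, so round-trip time grows by about 1 ms per 100 km of distance, before any routing or processing overhead. A quick sketch (the distances are arbitrary examples):

```python
# Rough lower bound on round-trip latency added by distance alone.
# Assumes ~200,000 km/s signal speed in fiber; ignores routing hops.
FIBER_KM_PER_S = 200_000

for distance_km in (10, 100, 500, 1000):
    rtt_ms = 2 * distance_km / FIBER_KM_PER_S * 1000
    print(f"{distance_km:5} km from the server: >= {rtt_ms:.1f} ms RTT")
```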

I'm looking forward to trying the 4080 tier, as I won't have access to a 40-series card until the end of the year.
 

lukilladog

Member
One way to bother Nvidia and still get a decent card is to go second-hand and pass on games made by their shills (at least for a year)... you don't have to dance at whatever pace they set.
 
Last edited:

clampzyn

Member
Also, these boxes are bottlenecking gaming in the first place
So by this you mean you want games to be made to cater to, like, a 3090 or a 4080, so the majority of gamers won't be able to play them?
 
You're literally saving the upfront cost of $799 and then the rest of the computer; call it $2300+ for a high-end build. Not to mention your electricity bill from the power consumption.

Also, as @Leo9 mentioned, the 3080 tier came out last year, and this year they are giving the 4080 tier to Founders for free, which means every year and a half you may also be upgraded to a new SuperPod. The 4090 is also $99. You tell me if that's not worth it versus a $3500 PC, which will also depreciate in value.

Also: 240 fps streaming, RTX, DLSS 3, widescreen support, 4K enhanced on the PC app, mobility on any screen, region-free when you're moving around, and Nvidia Reflex. The value proposition is insane, minus the catalog of games. I don't know why people are not being economical. Really try the service, it's amazing.

Between this and my M1 Mac Mini with Parallels, I get everything I need.
People are gonna hit you with the triggered emoji and shit on this post but I actually agree.

The biggest problem with GFN is the library, which gets better every day but is still limited; I understand why, because of licensing, but for me it's actually been great. When it comes to general computing, I much prefer Mac over Windows PCs, despite using PCs and playing games on them since I was a little kid in the 90s, so being able to play most of the games I play on my PC (an ITX build with a 5600X and a 3060 Ti) at 1600p 120fps on my MBP14's mini-LED display is amazing. I dock my MacBook at my desk when I'm just browsing around, and sometimes I find myself booting up GFN rather than switching over to my PC, even though it's within arm's reach.

I do have AT&T fiber, so I'm sure that gives me a much better experience, as I get 500 down.
 