I often get the idea in these discussions that there's a logical error in the way humans are built: problems that everyone in the same situation suffers feel far more acceptable than ones that are unique to you, even if the unique ones are less severe.
It's the same when someone argues that they will buy a console version if their PC can't run the PC version at Ultra / 60 FPS, even though the console version inherently runs at the equivalent of Medium / 30 FPS.
It doesn't make any sense to me, but clearly the impetus is there.
Here's how I view it, and it might play exactly into what you described:
The few times I've made the foray into PC gaming, my entire goal for building the PC was to ensure better-than-console performance in the same types of games I traditionally play on console: specifically, higher settings + 60fps, with no sacrifice made to that 60fps target, and to sustain that level of performance for an extended period of time (say, a console generation's worth -- 4ish years).
Perhaps it can be chalked up to ignorance about the performance cost of certain settings, and about how those settings advance over time, but the moment that 60fps mark can't be hit and the settings in games no longer register as "high" or "normal," the gaming PC immediately starts to feel, in my mind, inadequate at the task it was built for. And usually I'll try to pick parts that might future-proof the machine against that, which raises the cost of the machine itself.
I mean, my first machine was relatively recent (2012), but I packed that thing with an i5-2500K and a GTX 680, which were both really high-quality parts; the GTX 680 alone cost more than a PS3 or Xbox 360 at the time. The moment I couldn't max out BioShock Infinite (a year after I built it) and maintain 60fps was the moment I felt I had spent too much money on a machine that couldn't quite achieve what I built it for. Again, I was completely ignorant of the level of settings it was operating at... but I was under the impression that the GTX 680 would handily smoke a game built for console. It was more or less a misunderstanding on my part: the PC space was adding much more advanced, taxing settings, rather than performance simply scaling linearly with the power of GPUs, CPUs, etc...
It's like telling PS4 Pro owners that your GTX 1080 or Titan X Pascal GPU looks phenomenal at 4K, but then undermining the value of checkerboarding for not being true 4K resolution. A lot of PS4 Pro owners probably just barely scraped together the money to play anything at any semblance of 4K, while PC gamers have been spending TONS of money to do it natively for a while now. I think there's a complete gap in understanding here: the cost of maintaining what most people will perceive as significantly better performance/presentation is exceedingly high, and no one wants to build a complicated, multi-purpose machine, even at the same cost as a console, just to play games -- because it complicates the experience of playing them beyond what they're willing to deal with.
I like PC gaming, but I don't like trying to keep up with it. It feels like it only provides that "better than console" space because it isn't afraid to keep tacking premium pricing and more complex systems on top of everything. I don't even remotely have a PC that can function as a 4K machine, and I don't have displays that reach 120Hz or 144Hz refresh, so I have no idea what that's even like. But I look at the cost of what it takes to achieve either of those metrics (even just buying a 120Hz monitor and playing something low-requirement) and it's just too much to sell me on paper alone. Maybe if I could spend some time playing one of my favorite games on someone's 120Hz+ monitor, to actually know what that's like, I'd feel differently, but I seem to be having a fine time at just 60Hz right now.
I'd be willing to bet that iterative console updates like the PS4 Pro are testing whether the market will accept the premium-pricing portion once you remove the complexity of having to interface through a full PC operating system. To most people who already bought a PS4, though, it seems that even a $400 PS4 Pro is out of the question, at least at the current level of enticement from the actual content: they either don't have a 4K display, or just don't see enough of an upgrade from playing on a Pro to justify the cost. Why would they feel any differently about PC from a gaming perspective, PLUS the complexity of using a PC to play the games?
I keep disconnecting from PC gaming because what I want it to achieve always just seems too costly. I always feel like I spend more money than I spend on my consoles just to get the machine in the first place, only to have PC gamers move on to the next high refresh rate or the next super-taxing resolution while barely ever dropping settings below "High" presets, while I'm over here barely maintaining "medium" settings at 1080p/60Hz and wondering if my PC gaming is actually PC gaming, or if I'm just spending so much damn money to have essentially the same experience that satisfies me on console. So having dipped my toes into PC, I'm like "well of COURSE I'm only going to get 30fps/medium, because I'm playing on my PS4 and not on my PC, but at least I don't feel inadequate that my $1000+ machine isn't quite reaching 120fps when I take vsync off!... not that I have a 120Hz monitor anyway."
And an aside: thanks for all you do for the gaming community, Durante. I never really thanked you before, but I used your DSfix and linked my buddy to your Tales fix, and it's been nothing but great.